Sep 30 13:35:26 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 13:35:26 crc restorecon[4746]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:35:26 crc restorecon[4746]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:35:26 crc restorecon[4746]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:35:26 crc restorecon[4746]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:35:26 crc restorecon[4746]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:26 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:35:27 crc restorecon[4746]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 
13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:35:27 crc 
restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 
13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 
13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc 
restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:35:27 crc restorecon[4746]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:35:27 crc restorecon[4746]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 13:35:28 crc kubenswrapper[4763]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 13:35:28 crc kubenswrapper[4763]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 13:35:28 crc kubenswrapper[4763]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 13:35:28 crc kubenswrapper[4763]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 30 13:35:28 crc kubenswrapper[4763]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 13:35:28 crc kubenswrapper[4763]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.238787 4763 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242047 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242068 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242073 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242077 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242081 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242085 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242089 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242094 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242098 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242101 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242105 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242110 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242115 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242121 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242128 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242133 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242138 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242144 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242148 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242153 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242156 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242160 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242164 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242169 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242172 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242176 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242179 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242189 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242194 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242197 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242201 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242205 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242209 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242213 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242217 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242220 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242224 4763 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242228 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242232 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242236 4763 
feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242239 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242243 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242247 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242250 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242254 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242258 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242261 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242265 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242269 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242274 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242277 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242283 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242286 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242290 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242294 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242297 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242300 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242304 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242308 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242312 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242316 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242321 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242327 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
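Mixed into the "unrecognized" noise are a few feature_gate.go:351/353 entries of a different kind, such as the ValidatingAdmissionPolicy, DisableKubeletCloudCredentialProviders, and KMSv1 lines above: gates that are explicitly set even though they are already GA or deprecated and are slated for removal. Those are the overrides worth cleaning out of the configuration. A sketch that surfaces them (Python; sample lines abbreviated from the excerpt):

```python
import re

SAMPLE = """
... feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
... feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
... feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
"""

# "Setting GA feature gate X=true" / "Setting deprecated feature gate X=true"
SETTING = re.compile(r"Setting (GA|deprecated) feature gate (\w+)=(true|false)")

def stale_gate_overrides(journal_text: str) -> list[tuple[str, str, str]]:
    """Return (stage, gate, value) for gates set despite being GA or deprecated."""
    return SETTING.findall(journal_text)

if __name__ == "__main__":
    for stage, gate, value in stale_gate_overrides(SAMPLE):
        print(f"{gate}={value} ({stage}; slated for removal)")
```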
Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242333 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242338 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242351 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242357 4763 feature_gate.go:330] unrecognized feature gate: Example Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242363 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242367 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242372 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.242377 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242478 4763 flags.go:64] FLAG: --address="0.0.0.0" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242488 4763 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242497 4763 flags.go:64] FLAG: --anonymous-auth="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242503 4763 flags.go:64] FLAG: --application-metrics-count-limit="100" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242511 4763 flags.go:64] FLAG: --authentication-token-webhook="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242516 4763 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242523 4763 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242530 4763 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242535 4763 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242539 4763 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242545 4763 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242549 4763 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242554 4763 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242558 4763 flags.go:64] FLAG: --cgroup-root="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242562 4763 flags.go:64] FLAG: --cgroups-per-qos="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242567 4763 flags.go:64] FLAG: --client-ca-file="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242572 4763 flags.go:64] FLAG: --cloud-config="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242577 4763 flags.go:64] FLAG: --cloud-provider="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242582 4763 flags.go:64] FLAG: --cluster-dns="[]" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242589 4763 flags.go:64] FLAG: --cluster-domain="" Sep 30 
13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242610 4763 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242616 4763 flags.go:64] FLAG: --config-dir="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242621 4763 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242627 4763 flags.go:64] FLAG: --container-log-max-files="5" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242634 4763 flags.go:64] FLAG: --container-log-max-size="10Mi" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242638 4763 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242642 4763 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242647 4763 flags.go:64] FLAG: --containerd-namespace="k8s.io" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242652 4763 flags.go:64] FLAG: --contention-profiling="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242656 4763 flags.go:64] FLAG: --cpu-cfs-quota="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242661 4763 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242666 4763 flags.go:64] FLAG: --cpu-manager-policy="none" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242671 4763 flags.go:64] FLAG: --cpu-manager-policy-options="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242677 4763 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242682 4763 flags.go:64] FLAG: --enable-controller-attach-detach="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242686 4763 flags.go:64] FLAG: --enable-debugging-handlers="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242690 4763 flags.go:64] FLAG: --enable-load-reader="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242695 4763 flags.go:64] FLAG: --enable-server="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242700 4763 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242705 4763 flags.go:64] FLAG: --event-burst="100" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242710 4763 flags.go:64] FLAG: --event-qps="50" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242714 4763 flags.go:64] FLAG: --event-storage-age-limit="default=0" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242719 4763 flags.go:64] FLAG: --event-storage-event-limit="default=0" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242724 4763 flags.go:64] FLAG: --eviction-hard="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242730 4763 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242734 4763 flags.go:64] FLAG: --eviction-minimum-reclaim="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242738 4763 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242744 4763 flags.go:64] FLAG: --eviction-soft="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242748 4763 flags.go:64] FLAG: --eviction-soft-grace-period="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242753 4763 flags.go:64] FLAG: 
--exit-on-lock-contention="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242758 4763 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242763 4763 flags.go:64] FLAG: --experimental-mounter-path="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242768 4763 flags.go:64] FLAG: --fail-cgroupv1="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242773 4763 flags.go:64] FLAG: --fail-swap-on="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242777 4763 flags.go:64] FLAG: --feature-gates="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242784 4763 flags.go:64] FLAG: --file-check-frequency="20s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242788 4763 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242793 4763 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242797 4763 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242802 4763 flags.go:64] FLAG: --healthz-port="10248" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242806 4763 flags.go:64] FLAG: --help="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242811 4763 flags.go:64] FLAG: --hostname-override="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242815 4763 flags.go:64] FLAG: --housekeeping-interval="10s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242819 4763 flags.go:64] FLAG: --http-check-frequency="20s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242823 4763 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242827 4763 flags.go:64] FLAG: --image-credential-provider-config="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242831 4763 flags.go:64] FLAG: --image-gc-high-threshold="85" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242835 4763 flags.go:64] FLAG: --image-gc-low-threshold="80" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242839 4763 flags.go:64] FLAG: --image-service-endpoint="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242843 4763 flags.go:64] FLAG: --kernel-memcg-notification="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242847 4763 flags.go:64] FLAG: --kube-api-burst="100" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242851 4763 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242855 4763 flags.go:64] FLAG: --kube-api-qps="50" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242859 4763 flags.go:64] FLAG: --kube-reserved="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242864 4763 flags.go:64] FLAG: --kube-reserved-cgroup="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242868 4763 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242874 4763 flags.go:64] FLAG: --kubelet-cgroups="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242879 4763 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242883 4763 flags.go:64] FLAG: --lock-file="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242889 4763 flags.go:64] FLAG: --log-cadvisor-usage="false" Sep 
30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242894 4763 flags.go:64] FLAG: --log-flush-frequency="5s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242899 4763 flags.go:64] FLAG: --log-json-info-buffer-size="0" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242908 4763 flags.go:64] FLAG: --log-json-split-stream="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242913 4763 flags.go:64] FLAG: --log-text-info-buffer-size="0" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242918 4763 flags.go:64] FLAG: --log-text-split-stream="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242922 4763 flags.go:64] FLAG: --logging-format="text" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242926 4763 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242931 4763 flags.go:64] FLAG: --make-iptables-util-chains="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242935 4763 flags.go:64] FLAG: --manifest-url="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242939 4763 flags.go:64] FLAG: --manifest-url-header="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242944 4763 flags.go:64] FLAG: --max-housekeeping-interval="15s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242948 4763 flags.go:64] FLAG: --max-open-files="1000000" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242956 4763 flags.go:64] FLAG: --max-pods="110" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242960 4763 flags.go:64] FLAG: --maximum-dead-containers="-1" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242965 4763 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242969 4763 flags.go:64] FLAG: --memory-manager-policy="None" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242973 4763 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242978 4763 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242982 4763 flags.go:64] FLAG: --node-ip="192.168.126.11" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242986 4763 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.242998 4763 flags.go:64] FLAG: --node-status-max-images="50" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243002 4763 flags.go:64] FLAG: --node-status-update-frequency="10s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243006 4763 flags.go:64] FLAG: --oom-score-adj="-999" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243011 4763 flags.go:64] FLAG: --pod-cidr="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243015 4763 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243022 4763 flags.go:64] FLAG: --pod-manifest-path="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243026 4763 flags.go:64] FLAG: --pod-max-pids="-1" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243032 4763 flags.go:64] FLAG: --pods-per-core="0" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243036 4763 flags.go:64] 
FLAG: --port="10250" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243041 4763 flags.go:64] FLAG: --protect-kernel-defaults="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243045 4763 flags.go:64] FLAG: --provider-id="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243049 4763 flags.go:64] FLAG: --qos-reserved="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243053 4763 flags.go:64] FLAG: --read-only-port="10255" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243057 4763 flags.go:64] FLAG: --register-node="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243061 4763 flags.go:64] FLAG: --register-schedulable="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243066 4763 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243074 4763 flags.go:64] FLAG: --registry-burst="10" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243078 4763 flags.go:64] FLAG: --registry-qps="5" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243083 4763 flags.go:64] FLAG: --reserved-cpus="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243087 4763 flags.go:64] FLAG: --reserved-memory="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243092 4763 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243097 4763 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243101 4763 flags.go:64] FLAG: --rotate-certificates="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243105 4763 flags.go:64] FLAG: --rotate-server-certificates="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243109 4763 flags.go:64] FLAG: --runonce="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243113 4763 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243118 4763 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243123 4763 flags.go:64] FLAG: --seccomp-default="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243127 4763 flags.go:64] FLAG: --serialize-image-pulls="true" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243132 4763 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243137 4763 flags.go:64] FLAG: --storage-driver-db="cadvisor" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243141 4763 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243145 4763 flags.go:64] FLAG: --storage-driver-password="root" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243150 4763 flags.go:64] FLAG: --storage-driver-secure="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243155 4763 flags.go:64] FLAG: --storage-driver-table="stats" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243159 4763 flags.go:64] FLAG: --storage-driver-user="root" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243163 4763 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243168 4763 flags.go:64] FLAG: --sync-frequency="1m0s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243172 4763 flags.go:64] FLAG: --system-cgroups="" Sep 30 13:35:28 
crc kubenswrapper[4763]: I0930 13:35:28.243176 4763 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243183 4763 flags.go:64] FLAG: --system-reserved-cgroup="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243187 4763 flags.go:64] FLAG: --tls-cert-file="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243191 4763 flags.go:64] FLAG: --tls-cipher-suites="[]" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243197 4763 flags.go:64] FLAG: --tls-min-version="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243202 4763 flags.go:64] FLAG: --tls-private-key-file="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243205 4763 flags.go:64] FLAG: --topology-manager-policy="none" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243210 4763 flags.go:64] FLAG: --topology-manager-policy-options="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243214 4763 flags.go:64] FLAG: --topology-manager-scope="container" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243218 4763 flags.go:64] FLAG: --v="2" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243224 4763 flags.go:64] FLAG: --version="false" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243230 4763 flags.go:64] FLAG: --vmodule="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243236 4763 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243241 4763 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243344 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243349 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243354 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243358 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243363 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243368 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243373 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243377 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243381 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243385 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243388 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243392 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243395 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243400 4763 feature_gate.go:353] Setting GA feature 
gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243405 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243411 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243416 4763 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243420 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243425 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243429 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243434 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243438 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243443 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243447 4763 feature_gate.go:330] unrecognized feature gate: Example Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243451 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243454 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243458 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243461 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243465 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243468 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243474 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243478 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243481 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243485 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243489 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243493 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243497 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243500 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243504 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243508 
4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243512 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243515 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243518 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243522 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243527 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243531 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243535 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243540 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243544 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243548 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243551 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243555 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243559 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243562 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243566 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243569 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243572 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243577 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
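The flags.go:64 block earlier in the log dumps every kubelet flag with its effective value, one FLAG: --name="value" entry per message, so the complete runtime flag table can be reconstructed straight from the journal. A sketch (Python; the sample entries are copied from the dump above):

```python
import re

# Sample FLAG entries copied from the flags.go:64 dump (prefixes abbreviated).
SAMPLE = """
... flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
... flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
... flags.go:64] FLAG: --node-ip="192.168.126.11"
... flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
"""

FLAG = re.compile(r'FLAG: (--[\w-]+)="(.*)"')

def kubelet_flags(journal_text: str) -> dict[str, str]:
    """Rebuild the kubelet's flag/value table from a flags.go:64 dump."""
    return dict(FLAG.findall(journal_text))

if __name__ == "__main__":
    flags = kubelet_flags(SAMPLE)
    print(flags["--node-ip"])          # 192.168.126.11
    print(flags["--system-reserved"])  # cpu=200m,ephemeral-storage=350Mi,memory=350Mi
```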
Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243581 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243585 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243588 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243592 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243611 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243615 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243619 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243623 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243626 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243631 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243635 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243638 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.243643 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.243655 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.251774 4763 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.251802 4763 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251888 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251897 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251902 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251906 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251911 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251916 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251921 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 
13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251925 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251929 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251934 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251938 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251942 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251946 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251950 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251954 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251957 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251961 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251965 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251968 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251972 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251975 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251979 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251982 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251986 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251989 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251993 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.251997 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252001 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252004 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252007 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252011 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252018 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252022 
4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252026 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252030 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252034 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252038 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252042 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252046 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252050 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252055 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252059 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252064 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252069 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252074 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252080 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252084 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252090 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252096 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252100 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252104 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252108 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252112 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252117 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
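Each parsing pass ends with a feature_gate.go:386 summary, "feature gates: {map[...]}" (one appears above, and the identical map is printed again below): the final resolved gate set once the unrecognized names have been dropped. The Go map literal converts easily back into a dictionary; a sketch, with the map text copied (truncated for brevity) from the log:

```python
import re

# The feature_gate.go:386 summary line, copied from the log and truncated.
SUMMARY = ("feature gates: {map[CloudDualStackNodeIPs:true "
           "DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false "
           "EventedPLEG:false KMSv1:true ValidatingAdmissionPolicy:true]}")

PAIR = re.compile(r"(\w+):(true|false)")

def effective_gates(summary_line: str) -> dict[str, bool]:
    """Parse the Go map literal from a feature_gate.go:386 line into bools."""
    inner = summary_line.split("{map[", 1)[1].rsplit("]}", 1)[0]
    return {name: value == "true" for name, value in PAIR.findall(inner)}

if __name__ == "__main__":
    gates = effective_gates(SUMMARY)
    print(gates["KMSv1"])        # True
    print(gates["EventedPLEG"])  # False
```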
Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252121 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252126 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252130 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252135 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252139 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252144 4763 feature_gate.go:330] unrecognized feature gate: Example Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252148 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252152 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252156 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252163 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252167 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252204 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252208 4763 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252212 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252216 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252219 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252224 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.252232 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252364 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252372 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252376 4763 feature_gate.go:330] unrecognized feature gate: Example Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252380 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252384 4763 feature_gate.go:330] unrecognized 
feature gate: NetworkLiveMigration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252388 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252392 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252395 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252401 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252404 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252408 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252412 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252416 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252419 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252424 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252428 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252432 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252436 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252439 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252443 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252447 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252451 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252455 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252459 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252464 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252468 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252472 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252476 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252480 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252483 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252487 4763 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAzure Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252491 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252496 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252501 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252506 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252511 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252516 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252520 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252524 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252528 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252532 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252536 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252540 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252543 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252547 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252551 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252555 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252558 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252562 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252566 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252570 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252573 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252578 4763 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252582 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252587 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252607 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252680 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252686 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252691 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252695 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252699 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252704 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252708 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252713 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252717 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252721 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252725 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252730 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252733 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252737 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.252742 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.252748 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.252874 4763 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.256327 4763 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.256399 4763 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
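The client-certificate entries that follow show rotation already scheduled: expiration 2026-02-24 05:52:08 UTC, rotation deadline 2026-01-14 19:28:07 UTC, and a wait of 2549h52m39.631493967s. That wait is a Go-style duration string; a sketch that converts it to a wall-clock deadline for monitoring purposes (the duration and log timestamp are copied from the entries below; the helper is illustrative):

```python
import re
from datetime import datetime, timedelta, timezone

# Go-style duration from the certificate_manager.go:356 entry below.
WAIT = "2549h52m39.631493967s"

# Go prints durations with hours as the largest unit: [Nh][Nm][N.Ns]
GO_DURATION = re.compile(r"(?:(\d+)h)?(?:(\d+)m)?(?:([\d.]+)s)?$")

def parse_go_duration(text: str) -> timedelta:
    """Convert a Go duration like '2549h52m39.63s' into a timedelta."""
    h, m, s = GO_DURATION.match(text).groups(default="0")
    return timedelta(hours=int(h), minutes=int(m), seconds=float(s))

if __name__ == "__main__":
    logged_at = datetime(2025, 9, 30, 13, 35, 28, tzinfo=timezone.utc)
    wait = parse_go_duration(WAIT)
    print(wait)              # 106 days, 5:52:39.631494
    print(logged_at + wait)  # 2026-01-14 19:28 UTC, matching the logged deadline
```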
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.257840 4763 server.go:997] "Starting client certificate rotation"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.257860 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.258074 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-14 19:28:07.889679525 +0000 UTC
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.258188 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2549h52m39.631493967s for next certificate rotation
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.283020 4763 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.287107 4763 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.311971 4763 log.go:25] "Validated CRI v1 runtime API"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.363167 4763 log.go:25] "Validated CRI v1 image API"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.365881 4763 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.373316 4763 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-13-30-46-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.373377 4763 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.395493 4763 manager.go:217] Machine: {Timestamp:2025-09-30 13:35:28.392731498 +0000 UTC m=+0.531291823 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:aaaf82b4-c2c0-416a-9ead-be6eb519b6b5 BootID:87cb1e2c-9b8e-4ead-9950-c0bd55b572ab Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a2:19:07 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a2:19:07 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c2:2d:5f Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:7d:3a:b1 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:28:90:70 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:58:a6:71 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:a0:62:74 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3e:c2:8b:01:67:c5 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:7a:52:1a:21:45:84 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.395838 4763 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.396072 4763 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.397150 4763 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.397411 4763 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.397461 4763 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.397768 4763 topology_manager.go:138] "Creating topology manager with none policy"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.397781 4763 container_manager_linux.go:303] "Creating device plugin manager"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.398338 4763 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.398375 4763 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.398771 4763 state_mem.go:36] "Initialized new in-memory state store"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.399215 4763 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.403820 4763 kubelet.go:418] "Attempting to sync node with API server"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.403852 4763 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.403888 4763 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.403939 4763 kubelet.go:324] "Adding apiserver pod source"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.403958 4763 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.408471 4763 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.409441 4763 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.411784 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused
Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.411802 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused
Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.411955 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.145:6443: connect: connection refused" logger="UnhandledError"
Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.411964 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.145:6443: connect: connection refused" logger="UnhandledError"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.412008 4763 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413373 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413408 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413415 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413423 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413433 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413440 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413447 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413459 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413468 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413475 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413493 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.413501 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.415164 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.415791 4763 server.go:1280] "Started kubelet"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.417154 4763 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.417785 4763 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 30 13:35:28 crc systemd[1]: Started Kubernetes Kubelet.
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.417325 4763 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.421436 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.422085 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.422139 4763 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.422347 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 04:42:17.303125826 +0000 UTC Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.422464 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1359h6m48.880668284s for next certificate rotation Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.423159 4763 volume_manager.go:287] "The desired_state_of_world populator starts" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.423245 4763 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.422988 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.145:6443: connect: connection refused" interval="200ms" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.423499 4763 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.423109 4763 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.424479 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.424567 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.145:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.426259 4763 factory.go:55] Registering systemd factory Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.429717 4763 factory.go:221] Registration of the systemd container factory successfully Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.433509 4763 server.go:460] "Adding debug handlers to kubelet server" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.433776 4763 factory.go:153] Registering CRI-O factory Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.433824 4763 factory.go:221] Registration of the crio container factory successfully Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.433951 4763 factory.go:219] Registration of the containerd container factory failed: 
unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.433977 4763 factory.go:103] Registering Raw factory Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.433993 4763 manager.go:1196] Started watching for new ooms in manager Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.434685 4763 manager.go:319] Starting recovery of all containers Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.431161 4763 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.145:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a12d92d1e2fcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 13:35:28.415756235 +0000 UTC m=+0.554316530,LastTimestamp:2025-09-30 13:35:28.415756235 +0000 UTC m=+0.554316530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439136 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439230 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439248 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439267 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439316 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439335 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439351 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439364 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439378 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439389 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439400 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439413 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439425 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439436 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439450 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439461 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439472 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439482 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439493 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439506 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439517 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.439528 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442584 4763 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442638 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442652 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442668 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442683 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442702 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442716 4763 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442730 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442745 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442756 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442766 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442779 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442791 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442808 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442822 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442834 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442847 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442861 4763 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442874 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442888 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442901 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442914 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442933 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442943 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442959 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442976 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442987 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.442998 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443007 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443021 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443044 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443064 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443078 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443098 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443110 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443121 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443132 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443144 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443156 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443170 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443183 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443195 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443205 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443218 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443230 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443240 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443252 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443264 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443275 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443284 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443295 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443306 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443319 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443330 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443342 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443353 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443364 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443375 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443386 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443398 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443410 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443421 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443432 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443445 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443458 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443471 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443482 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443495 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443509 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443526 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443606 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443620 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443632 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443642 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443652 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443663 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443674 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443690 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443701 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443711 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443723 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443733 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443744 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443758 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443770 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443784 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443797 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443808 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443819 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443861 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443875 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443886 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443899 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443910 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443918 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443929 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443940 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443950 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443963 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443973 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443984 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.443994 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444006 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444015 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444030 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444040 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444050 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444063 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444073 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444082 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444094 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444106 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444118 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444130 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444142 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444153 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444164 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444175 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444186 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444197 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444209 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444220 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444232 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444248 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444259 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444271 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444281 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444292 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444303 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444314 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444324 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444336 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444347 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444358 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.444367 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445191 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445230 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445241 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445267 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445279 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445297 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445311 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445321 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445339 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445349 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445369 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445381 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445393 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445409 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445422 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445439 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445453 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445467 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445490 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445504 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445520 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445533 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445547 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445566 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445577 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.445725 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446178 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446203 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446229 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446243 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446265 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446279 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446293 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446313 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446330 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446345 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446367 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446381 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446402 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446415 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446428 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446443 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446461 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446483 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446499 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446515 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446537 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446555 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446574 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446587 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446616 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446626 4763 reconstruct.go:97] "Volume reconstruction finished" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.446635 4763 reconciler.go:26] "Reconciler: start to sync state" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.452813 4763 manager.go:324] Recovery completed Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.463716 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.465812 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.465944 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.466015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.467484 4763 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.467510 4763 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.467541 4763 state_mem.go:36] "Initialized new in-memory state store" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.485791 4763 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.486119 4763 policy_none.go:49] "None policy: Start" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.487405 4763 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.487460 4763 state_mem.go:35] "Initializing new in-memory state store" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.488050 4763 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.488106 4763 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.488146 4763 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.488210 4763 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.490007 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.490169 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.145:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.524285 4763 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.549119 4763 manager.go:334] "Starting Device Plugin manager" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.549378 4763 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.549398 4763 server.go:79] "Starting device plugin registration server" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.549920 4763 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.549937 4763 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.550576 4763 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.550684 4763 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.550691 4763 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.558310 4763 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.589040 4763 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.589191 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.591027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.591070 4763 
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.591085 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.591226 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.592057 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.592124 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.592766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.593027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.593213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.593700 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.593752 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.593707 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.594043 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.594072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.594082 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.595030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.595062 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.595075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.595223 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.595331 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.595359 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.595707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.596009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.596225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.596108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.596561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.596574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.596720 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.596444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.597011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.597019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.597169 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.597186 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.597771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.597794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.597802 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.597903 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.597922 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.598198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.598210 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.598217 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.598608 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.598636 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.598649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.624683 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.145:6443: connect: connection refused" interval="400ms" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.648503 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.648713 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.648830 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.648927 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.648999 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc 
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.649318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.649556 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.649625 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.649673 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.649698 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.649721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.649795 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.649873 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.649962 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.650058 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.652425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.652474 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.652490 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.652563 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.653591 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.145:6443: connect: connection refused" node="crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752075 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752149 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752183 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752214 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752257 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752293 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752320 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752350 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752381 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752367 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752418 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752411 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752516 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752521 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752566 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752583 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752631 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752664 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752677 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752691 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752710 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752667 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752693 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752566 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752424 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752372 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752778 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752818 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.752880 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.753033 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.854334 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.856882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.856946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.856970 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.857015 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:35:28 crc kubenswrapper[4763]: E0930 13:35:28.857765 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.145:6443: connect: connection refused" node="crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.933325 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.948217 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.970276 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.980942 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: I0930 13:35:28.987187 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.991377 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-0f7cb30fe56aa79f5bf7d047a9aaf00adac1a8b9cee3251d34aa3ecb59924adf WatchSource:0}: Error finding container 0f7cb30fe56aa79f5bf7d047a9aaf00adac1a8b9cee3251d34aa3ecb59924adf: Status 404 returned error can't find the container with id 0f7cb30fe56aa79f5bf7d047a9aaf00adac1a8b9cee3251d34aa3ecb59924adf Sep 30 13:35:28 crc kubenswrapper[4763]: W0930 13:35:28.996258 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-295b12207691670f733882cf1ac928a9b9b1d977593994fe726d4ed9f7356194 WatchSource:0}: Error finding container 295b12207691670f733882cf1ac928a9b9b1d977593994fe726d4ed9f7356194: Status 404 returned error can't find the container with id 295b12207691670f733882cf1ac928a9b9b1d977593994fe726d4ed9f7356194 Sep 30 13:35:29 crc kubenswrapper[4763]: W0930 13:35:29.005221 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2192fdbc6c3b98a01ccc22282b7cbb9f08eddada835164c45f558a38e0ad857e WatchSource:0}: Error finding container 2192fdbc6c3b98a01ccc22282b7cbb9f08eddada835164c45f558a38e0ad857e: Status 404 returned error can't find the container with id 2192fdbc6c3b98a01ccc22282b7cbb9f08eddada835164c45f558a38e0ad857e Sep 30 13:35:29 crc kubenswrapper[4763]: E0930 13:35:29.026677 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.145:6443: connect: connection refused" interval="800ms" Sep 30 13:35:29 crc kubenswrapper[4763]: I0930 13:35:29.258560 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:29 crc kubenswrapper[4763]: I0930 13:35:29.260564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:29 crc kubenswrapper[4763]: I0930 13:35:29.260660 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:29 crc kubenswrapper[4763]: I0930 13:35:29.260677 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:29 crc kubenswrapper[4763]: I0930 13:35:29.260710 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:35:29 crc kubenswrapper[4763]: E0930 13:35:29.261332 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.145:6443: connect: connection refused" node="crc" Sep 30 13:35:29 crc kubenswrapper[4763]: W0930 13:35:29.419454 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": 
dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:29 crc kubenswrapper[4763]: E0930 13:35:29.419563 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.145:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:35:29 crc kubenswrapper[4763]: I0930 13:35:29.422592 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:29 crc kubenswrapper[4763]: I0930 13:35:29.492776 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b8a9e41e5a59b597a4ccbee7e62411ea8e4045d8d2e075eac7c7868b4b371953"} Sep 30 13:35:29 crc kubenswrapper[4763]: I0930 13:35:29.496617 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a1f26c3e6dca366b619d39f97a9f184f60d2c6829be14a8b28f7cb84c24c9bd5"} Sep 30 13:35:29 crc kubenswrapper[4763]: I0930 13:35:29.498620 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2192fdbc6c3b98a01ccc22282b7cbb9f08eddada835164c45f558a38e0ad857e"} Sep 30 13:35:29 crc kubenswrapper[4763]: I0930 13:35:29.499513 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"295b12207691670f733882cf1ac928a9b9b1d977593994fe726d4ed9f7356194"} Sep 30 13:35:29 crc kubenswrapper[4763]: W0930 13:35:29.499518 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:29 crc kubenswrapper[4763]: E0930 13:35:29.499615 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.145:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:35:29 crc kubenswrapper[4763]: I0930 13:35:29.500709 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0f7cb30fe56aa79f5bf7d047a9aaf00adac1a8b9cee3251d34aa3ecb59924adf"} Sep 30 13:35:29 crc kubenswrapper[4763]: W0930 13:35:29.551385 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:29 crc kubenswrapper[4763]: E0930 13:35:29.551781 4763 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.145:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:35:29 crc kubenswrapper[4763]: W0930 13:35:29.562780 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:29 crc kubenswrapper[4763]: E0930 13:35:29.562900 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.145:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:35:29 crc kubenswrapper[4763]: E0930 13:35:29.827989 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.145:6443: connect: connection refused" interval="1.6s" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.062226 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.064182 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.064219 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.064229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.064256 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:35:30 crc kubenswrapper[4763]: E0930 13:35:30.064799 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.145:6443: connect: connection refused" node="crc" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.422966 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.507687 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef"} Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.507764 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.507770 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898"} Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.508302 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e"} Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.508328 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3"} Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.509104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.509137 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.509147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.510721 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298" exitCode=0 Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.510842 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298"} Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.510932 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.512074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.512126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.512144 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.513190 4763 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ed9cc2883dbd039b8ec767fb752d1fa8c5533f80bf47d819598a6ac173959563" exitCode=0 Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.513241 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ed9cc2883dbd039b8ec767fb752d1fa8c5533f80bf47d819598a6ac173959563"} Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.513334 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.514999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 
13:35:30.515035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.515054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.518261 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.519170 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9c2e132a8990ee488ebe44e62e3685a9315406d28b0f25ffc9b9b7c2c5e37fc7" exitCode=0 Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.519306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9c2e132a8990ee488ebe44e62e3685a9315406d28b0f25ffc9b9b7c2c5e37fc7"} Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.519382 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.520051 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.520100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.520128 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.520984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.521030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.521047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.522760 4763 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb" exitCode=0 Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.522829 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb"} Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.522980 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.525803 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.525874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:30 crc kubenswrapper[4763]: I0930 13:35:30.525888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:31 crc kubenswrapper[4763]: W0930 13:35:31.181806 4763 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:31 crc kubenswrapper[4763]: E0930 13:35:31.181923 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.145:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:35:31 crc kubenswrapper[4763]: W0930 13:35:31.364480 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:31 crc kubenswrapper[4763]: E0930 13:35:31.364634 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.145:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.422646 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.145:6443: connect: connection refused Sep 30 13:35:31 crc kubenswrapper[4763]: E0930 13:35:31.429290 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.145:6443: connect: connection refused" interval="3.2s" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.528986 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f"} Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.529048 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a"} Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.529063 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745"} Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.529076 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd"} Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.530634 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dab73a8bdd6e33eb58327d87ab56400b259379d650b5e5f3b3c51e64d22beb1a"} Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.530753 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.531802 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.531846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.531859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.533757 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="680167ab3e00fcfd6baecf1531e26cea50fca3a45e6be6c6fa41123ce0e02b58" exitCode=0 Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.533838 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"680167ab3e00fcfd6baecf1531e26cea50fca3a45e6be6c6fa41123ce0e02b58"} Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.533932 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.535148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.535183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.535196 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.536997 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c471b7d5edb6a2a0a1b7df018d846b4d54af48c83aa59b0067b9a98be067aa96"} Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.537030 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.537046 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f882363620739f8700024600e56bc55742489a500c06f523fb9028dd2af5941f"} Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.537064 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"69768d25767b2d069d78da62764bb2be0c6c1c8f9b4378c499a20d0324fdb7c1"} Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.537022 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.537811 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:31 crc 
kubenswrapper[4763]: I0930 13:35:31.537839 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.537854 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.537956 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.538013 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.538029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.665769 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.667636 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.667711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.667723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:31 crc kubenswrapper[4763]: I0930 13:35:31.667777 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:35:31 crc kubenswrapper[4763]: E0930 13:35:31.668735 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.145:6443: connect: connection refused" node="crc" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.542347 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a73d14c3d9d5df2cfb202054a5f53d2aae6626001c7c94c1a4afd4e4a4c4203f" exitCode=0 Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.542428 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a73d14c3d9d5df2cfb202054a5f53d2aae6626001c7c94c1a4afd4e4a4c4203f"} Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.542452 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.543621 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.543664 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.543675 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.547251 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5"} Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.547282 4763 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.547320 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.547352 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.547382 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.548253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.548284 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.548294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.548304 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.548337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.548346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.548500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.548533 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:32 crc kubenswrapper[4763]: I0930 13:35:32.548545 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.174944 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.555176 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.555328 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.556734 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.556759 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.556771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.557018 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.557436 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:33 crc 
kubenswrapper[4763]: I0930 13:35:33.557719 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"272579ed8ae5590447af8815433a129c1d026c8f2fdfcf18c88c21db1e506188"} Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.558589 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0d9028a2b7f8b4c6f9ccb6bf1102f8da76012f0a190eb1c1b9d76ff43f0c5438"} Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.558680 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"439019841fd82c2a82ffce416c2757dc351fe196cef7507f652847ca5e44f925"} Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.558705 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9e3ff6f745a2555aefb1f26728ec8d1c63236f2957602682f42ae6dfccf7e7fb"} Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.558729 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"73a3bd9a0a04f5f323cf8f6d4756e24919b30921c63eb5811ba870aac537e17d"} Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.561557 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.561619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.561633 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.561716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.561765 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.561799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.566591 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:33 crc kubenswrapper[4763]: I0930 13:35:33.989342 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.559251 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.559306 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.559923 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.562667 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 
13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.562692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.562715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.562723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.562734 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.562737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.563727 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.563766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.563780 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.847851 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.848620 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.850089 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.850118 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.850129 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.869009 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.870478 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.870570 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.870647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:34 crc kubenswrapper[4763]: I0930 13:35:34.870714 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:35:35 crc kubenswrapper[4763]: I0930 13:35:35.561555 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:35 crc kubenswrapper[4763]: I0930 13:35:35.562882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:35 crc kubenswrapper[4763]: I0930 13:35:35.562941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:35:35 crc kubenswrapper[4763]: I0930 13:35:35.562952 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:35 crc kubenswrapper[4763]: I0930 13:35:35.636989 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.564532 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.565773 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.565805 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.565815 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.681514 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.681795 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.683170 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.683208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.683266 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.980226 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.980401 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.980438 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.981746 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.981823 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:36 crc kubenswrapper[4763]: I0930 13:35:36.981843 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:37 crc kubenswrapper[4763]: I0930 13:35:37.226923 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:37 crc kubenswrapper[4763]: I0930 13:35:37.567067 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:35:37 crc kubenswrapper[4763]: I0930 13:35:37.567120 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:37 crc kubenswrapper[4763]: I0930 13:35:37.568063 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:37 crc kubenswrapper[4763]: I0930 13:35:37.568104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:37 crc kubenswrapper[4763]: I0930 13:35:37.568118 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:38 crc kubenswrapper[4763]: E0930 13:35:38.558417 4763 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 13:35:38 crc kubenswrapper[4763]: I0930 13:35:38.597407 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:38 crc kubenswrapper[4763]: I0930 13:35:38.597780 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:38 crc kubenswrapper[4763]: I0930 13:35:38.599695 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:38 crc kubenswrapper[4763]: I0930 13:35:38.599773 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:38 crc kubenswrapper[4763]: I0930 13:35:38.599796 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:39 crc kubenswrapper[4763]: I0930 13:35:39.981242 4763 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 13:35:39 crc kubenswrapper[4763]: I0930 13:35:39.981411 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:35:42 crc kubenswrapper[4763]: W0930 13:35:42.062773 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 30 13:35:42 crc kubenswrapper[4763]: I0930 13:35:42.062874 4763 trace.go:236] Trace[760197439]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:35:32.060) (total time: 10002ms): Sep 30 13:35:42 crc kubenswrapper[4763]: Trace[760197439]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (13:35:42.062) Sep 30 13:35:42 crc kubenswrapper[4763]: Trace[760197439]: [10.002378388s] [10.002378388s] END Sep 30 13:35:42 crc kubenswrapper[4763]: E0930 13:35:42.062933 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 30 
13:35:42 crc kubenswrapper[4763]: I0930 13:35:42.239248 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 13:35:42 crc kubenswrapper[4763]: I0930 13:35:42.239366 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 13:35:42 crc kubenswrapper[4763]: I0930 13:35:42.251381 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 13:35:42 crc kubenswrapper[4763]: I0930 13:35:42.251446 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.176745 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.176857 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.511619 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.511856 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.513184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.513210 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.513219 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.557669 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.588055 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.589147 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.589185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.589197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:35:43 crc kubenswrapper[4763]: I0930 13:35:43.601264 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Sep 30 13:35:44 crc kubenswrapper[4763]: I0930 13:35:44.590682 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 13:35:44 crc kubenswrapper[4763]: I0930 13:35:44.591776 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:35:44 crc kubenswrapper[4763]: I0930 13:35:44.591820 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:35:44 crc kubenswrapper[4763]: I0930 13:35:44.591887 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:35:44 crc kubenswrapper[4763]: I0930 13:35:44.995687 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Sep 30 13:35:44 crc kubenswrapper[4763]: I0930 13:35:44.995761 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Sep 30 13:35:45 crc kubenswrapper[4763]: I0930 13:35:45.292507 4763 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Sep 30 13:35:45 crc kubenswrapper[4763]: I0930 13:35:45.642913 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 13:35:45 crc kubenswrapper[4763]: I0930 13:35:45.643158 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 13:35:45 crc kubenswrapper[4763]: I0930 13:35:45.643656 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Sep 30 13:35:45 crc kubenswrapper[4763]: I0930 13:35:45.643749 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Sep 30 13:35:45 crc kubenswrapper[4763]: I0930 13:35:45.644392 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:35:45 crc kubenswrapper[4763]: I0930 13:35:45.644430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:35:45 crc kubenswrapper[4763]: I0930 13:35:45.644440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:35:45 crc kubenswrapper[4763]: I0930 13:35:45.647424 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 13:35:46 crc kubenswrapper[4763]: I0930 13:35:46.599890 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 13:35:46 crc kubenswrapper[4763]: I0930 13:35:46.600678 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Sep 30 13:35:46 crc kubenswrapper[4763]: I0930 13:35:46.600724 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Sep 30 13:35:46 crc kubenswrapper[4763]: I0930 13:35:46.600957 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:35:46 crc kubenswrapper[4763]: I0930 13:35:46.600993 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:35:46 crc kubenswrapper[4763]: I0930 13:35:46.601006 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.221039 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.224174 4763 trace.go:236] Trace[402053728]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:35:35.316) (total time: 11907ms):
Sep 30 13:35:47 crc kubenswrapper[4763]: Trace[402053728]: ---"Objects listed" error: 11907ms (13:35:47.224)
Sep 30 13:35:47 crc kubenswrapper[4763]: Trace[402053728]: [11.907786254s] [11.907786254s] END
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.224212 4763 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.226074 4763 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.226084 4763 trace.go:236] Trace[500284559]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:35:35.043) (total time: 12182ms):
Sep 30 13:35:47 crc kubenswrapper[4763]: Trace[500284559]: ---"Objects listed" error: 12182ms (13:35:47.225)
Sep 30 13:35:47 crc kubenswrapper[4763]: Trace[500284559]: [12.182347379s] [12.182347379s] END
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.226113 4763 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.226360 4763 trace.go:236] Trace[357112924]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:35:32.278) (total time: 14947ms):
Sep 30 13:35:47 crc kubenswrapper[4763]: Trace[357112924]: ---"Objects listed" error: 14947ms (13:35:47.226)
Sep 30 13:35:47 crc kubenswrapper[4763]: Trace[357112924]: [14.94754072s] [14.94754072s] END
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.226373 4763 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.227161 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.293070 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.299631 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.303000 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.416667 4763 apiserver.go:52] "Watching apiserver"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.419339 4763 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.419701 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.420285 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.420404 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.420464 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.420518 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.420570 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.420482 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.420792 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.420945 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.420702 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.423236 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.423398 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.423479 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.423489 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.423641 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.424075 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.424486 4763 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.425430 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.426397 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.426457 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428107 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428142 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428170 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428197 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428232 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428255 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428279 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428297 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428317 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428334 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428349 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428369 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428389 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428405 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428420 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428436 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428455 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428477 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428493 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428513 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428528 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428530 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428544 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428641 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428670 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428673 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428695 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428715 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428738 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428762 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428785 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428820 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428841 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428861 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428875 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428884 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428934 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428962 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.428983 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429002 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429025 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429042 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429041 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429060 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429106 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429124 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429138 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429155 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429171 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429189 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429221 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429239 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429254 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429269 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429288 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429303 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429324 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429344 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429364 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429379 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429395 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429411 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429428 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429447 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429466 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429484 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429500 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429525 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429566 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429583 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429617 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429632 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429675 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429698 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429713 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429738 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429760 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429777 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429792 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429811 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429831 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429847 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429866 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.429888 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430009 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430032 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430047 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430063 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430079 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430111 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430128 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430149 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430171 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430191 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430206 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430239 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430254 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430273 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430296 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430325 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430331 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430345 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430350 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430376 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430471 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430500 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430535 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430561 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430562 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430576 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430591 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430676 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430720 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430746 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430770 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430791 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430797 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430861 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430893 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430919 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430954 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430988 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431019 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431052 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431083 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431112 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431142 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431172 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431205 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431237 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431268 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431300 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431335 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431368 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431389 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431409 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431433 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431455 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431476 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431501 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431523 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431574 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431622 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431642 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431659 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431677 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431696 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431714 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431733 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431752 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431773 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431793 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431820 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431839 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431861 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431878 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431898 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431915 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431932 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431952 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431972 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431993 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432012 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432032 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432054 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432073 4763 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432096 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432116 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432136 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432159 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432176 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432195 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432213 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432232 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432255 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " 
Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432276 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432293 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432315 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432337 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432357 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432376 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432395 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432416 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432434 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432455 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432474 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432494 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432514 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432533 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432551 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432571 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432589 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432627 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432663 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432689 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432712 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432734 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432754 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432772 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432791 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432813 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432831 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432884 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432912 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 
13:35:47.432935 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432996 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433017 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433035 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433055 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433079 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433106 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433150 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433177 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433206 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433229 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433248 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433318 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433331 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433343 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433354 4763 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433366 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433377 4763 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433390 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433403 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" 
(UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433414 4763 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.441754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.430937 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431005 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431111 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431180 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431287 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431399 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431431 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431711 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431768 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431949 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431951 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.431964 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432196 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432246 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432302 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432350 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432482 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432675 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.432888 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433703 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.433895 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.434130 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.434484 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.434698 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.434750 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.435068 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.435425 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.435829 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.436144 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.436304 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.436392 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.436568 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.436655 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.436692 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.436828 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.437029 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.437138 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.438075 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.438145 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.438334 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.438822 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.449403 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.439439 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.439582 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.439733 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.439978 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.440094 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.440303 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.440528 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.442831 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.443048 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.443571 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.443675 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.445492 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.445709 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.445746 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.445846 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.445935 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.445944 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.445998 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.446346 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.446672 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.447112 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.447494 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.447776 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.448277 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.448557 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.438941 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.449648 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.449836 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.449889 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.450181 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.450391 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.450825 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.450943 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.451464 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.451745 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.452004 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.452027 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.452164 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.452240 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.452512 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.453683 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.453720 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.453799 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.454944 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.455025 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.455070 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.461461 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.455290 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464894 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.455446 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.455441 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.456198 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.456994 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.457001 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.457223 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.465099 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.457450 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.465152 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.457847 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.458071 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.458269 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.458436 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.458540 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.458550 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.458989 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.459088 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.459229 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.459372 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.459786 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.459820 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.460070 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.460047 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.459927 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.460233 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.460421 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.460526 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.460535 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.460645 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.460912 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.460933 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.460949 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.461246 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.461192 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.461291 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.461447 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.461456 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.461517 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.461732 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.461845 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.461914 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.462948 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:35:47.946137062 +0000 UTC m=+20.084697347 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.463208 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.463302 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.463902 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464096 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464123 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464146 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464287 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464507 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464553 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464616 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464704 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464794 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464877 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464953 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.464726 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.455988 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.465175 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.465335 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.466503 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.466907 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.467012 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.467151 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.467513 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.467573 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:47.967542037 +0000 UTC m=+20.106102322 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.467652 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:47.967636819 +0000 UTC m=+20.106197324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.468000 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.468332 4763 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.468802 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.468810 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.472529 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.474205 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.479663 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.479737 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.479757 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.479984 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:47.979930594 +0000 UTC m=+20.118491089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.480872 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.482794 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.482891 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.484798 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.485015 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.485033 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.485044 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.485070 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.485097 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:47.985082347 +0000 UTC m=+20.123642632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.486482 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.487332 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.489103 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.489248 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.489340 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.489648 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.490074 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.490204 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.490509 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.490633 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.492688 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.493007 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.493145 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.493200 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.493757 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.493812 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.493951 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.494146 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.494334 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.494383 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.494402 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.494456 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.494651 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.500717 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.500946 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.500983 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.502157 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.504476 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.505520 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.515376 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0
a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.515739 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.516043 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.521172 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.531306 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535037 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535229 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535240 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535535 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535559 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535854 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535878 4763 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535892 4763 reconciler_common.go:293] "Volume detached for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535903 4763 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535914 4763 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535925 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535937 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535948 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535958 4763 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535968 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535978 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.535990 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536002 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536014 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536025 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536036 4763 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536046 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536057 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536067 4763 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536077 4763 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536088 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536099 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536110 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536122 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536131 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536143 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536155 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536166 4763 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 
13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536177 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536187 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536198 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536208 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536227 4763 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536236 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536246 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536256 4763 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536265 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536277 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536287 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536298 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536311 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 
13:35:47.536321 4763 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536330 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536341 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536352 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536362 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536372 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536401 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536410 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536420 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536430 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536440 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536449 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536459 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536474 4763 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536485 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536494 4763 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536505 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536516 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536526 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536537 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536547 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536556 4763 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536565 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536574 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536583 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536592 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536614 4763 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536625 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536635 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536645 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536653 4763 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536663 4763 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536672 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536681 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536697 4763 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536707 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536716 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536726 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536735 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536744 4763 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536753 4763 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536762 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536772 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536781 4763 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536791 4763 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536802 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536812 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536822 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536832 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536841 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536851 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536861 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536870 4763 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536881 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536891 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536901 4763 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536917 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.536982 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537011 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537021 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537030 4763 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537039 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537049 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537058 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537087 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537096 4763 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537105 4763 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537115 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537124 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537134 4763 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537162 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537173 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537181 4763 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537199 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537208 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537217 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537245 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537256 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537265 4763 reconciler_common.go:293] 
"Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537276 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537285 4763 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537293 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537320 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537332 4763 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537344 4763 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537358 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537368 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537377 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537407 4763 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537417 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537426 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537436 
4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537446 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537454 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537482 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537492 4763 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537503 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537516 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537526 4763 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537537 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537568 4763 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537578 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537588 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537632 4763 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537642 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537651 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537661 4763 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537671 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537679 4763 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537709 4763 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537719 4763 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537727 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537735 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537744 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537752 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537760 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537789 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537801 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537809 4763 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537817 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537825 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537834 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537860 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537869 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537878 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537887 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537904 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537913 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537939 4763 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537950 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537959 4763 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537967 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537976 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537985 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.537993 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.538023 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.538032 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.538039 4763 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.538049 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.538058 4763 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.538066 4763 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.538076 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.538104 4763 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.542295 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.555397 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.566072 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.579574 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.598134 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.611177 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.612812 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5" exitCode=255 Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.612899 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5"} Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.618194 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.618712 4763 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.624375 4763 scope.go:117] "RemoveContainer" containerID="44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.625807 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.631862 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.645620 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.659412 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.668997 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.679839 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.696032 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.708832 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30
T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.720218 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.737027 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.751990 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:35:47 crc kubenswrapper[4763]: W0930 13:35:47.768548 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-c38189e7eb59a465efad8f9b23378f0f3065517110cb7db8d7860687c9551044 WatchSource:0}: Error finding container c38189e7eb59a465efad8f9b23378f0f3065517110cb7db8d7860687c9551044: Status 404 returned error can't find the container with id c38189e7eb59a465efad8f9b23378f0f3065517110cb7db8d7860687c9551044 Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.794273 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:35:47 crc kubenswrapper[4763]: W0930 13:35:47.808041 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d827b6b460543f46c16c84f1f84af6b4b25d3b6699d513ac078837c3b3530370 WatchSource:0}: Error finding container d827b6b460543f46c16c84f1f84af6b4b25d3b6699d513ac078837c3b3530370: Status 404 returned error can't find the container with id d827b6b460543f46c16c84f1f84af6b4b25d3b6699d513ac078837c3b3530370 Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.814133 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:35:47 crc kubenswrapper[4763]: W0930 13:35:47.830918 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-e8106c59f5a07841fd5393a8277562ebe6170da4123d3b55d623ef1ce39fc6b3 WatchSource:0}: Error finding container e8106c59f5a07841fd5393a8277562ebe6170da4123d3b55d623ef1ce39fc6b3: Status 404 returned error can't find the container with id e8106c59f5a07841fd5393a8277562ebe6170da4123d3b55d623ef1ce39fc6b3 Sep 30 13:35:47 crc kubenswrapper[4763]: I0930 13:35:47.948867 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:35:47 crc kubenswrapper[4763]: E0930 13:35:47.949041 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:35:48.949024512 +0000 UTC m=+21.087584787 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.050312 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.050362 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.050388 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.050416 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050513 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050558 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050583 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050686 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050625 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050729 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050570 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:49.050552662 +0000 UTC m=+21.189112947 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050707 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050794 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:49.050768187 +0000 UTC m=+21.189328472 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050807 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050811 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:49.050804778 +0000 UTC m=+21.189365063 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.050846 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:49.050837639 +0000 UTC m=+21.189397924 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.498408 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.499301 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.500679 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.501445 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.502693 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.503326 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.504537 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.505520 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.506247 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.507223 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.507705 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.508904 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.510069 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.510963 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.512046 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.512534 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.513436 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.513822 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.514360 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.514873 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.515323 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.515835 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.516974 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.517378 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.518552 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.519070 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.519647 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.520744 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.521180 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.522163 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.522591 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.523484 4763 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.523581 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.525268 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.526369 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.526853 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.528566 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.528785 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.529250 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.530386 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.531000 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.532332 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.532904 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.534189 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.534986 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.536572 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.537116 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.538192 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.539032 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.540516 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.541277 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.542416 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.543018 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.544189 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.545099 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.545725 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.550780 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.564769 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.584648 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.601194 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.616798 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d827b6b460543f46c16c84f1f84af6b4b25d3b6699d513ac078837c3b3530370"} Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.618401 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0"} Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.618470 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489"} Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.618483 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c38189e7eb59a465efad8f9b23378f0f3065517110cb7db8d7860687c9551044"} Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.620304 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.627234 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1"} Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.627305 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.627694 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.629295 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0"} Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.629339 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e8106c59f5a07841fd5393a8277562ebe6170da4123d3b55d623ef1ce39fc6b3"} Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.642673 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.657758 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.673249 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.686960 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.700794 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.715357 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.734552 4763 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.746707 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.760863 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:48 crc kubenswrapper[4763]: I0930 13:35:48.958704 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:35:48 crc kubenswrapper[4763]: E0930 13:35:48.959050 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:35:50.959006606 +0000 UTC m=+23.097566901 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:35:49 crc kubenswrapper[4763]: I0930 13:35:49.059746 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:49 crc kubenswrapper[4763]: I0930 13:35:49.059821 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:35:49 crc kubenswrapper[4763]: I0930 13:35:49.059863 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:35:49 crc kubenswrapper[4763]: I0930 13:35:49.059894 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.060004 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.060087 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:51.060062217 +0000 UTC m=+23.198622502 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.060750 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.060806 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:51.060795102 +0000 UTC m=+23.199355377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.060890 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.060917 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.060932 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.060967 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:51.060955686 +0000 UTC m=+23.199515971 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.061022 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.061035 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.061045 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.061073 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:51.061064708 +0000 UTC m=+23.199624993 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:49 crc kubenswrapper[4763]: I0930 13:35:49.488757 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:35:49 crc kubenswrapper[4763]: I0930 13:35:49.488828 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:35:49 crc kubenswrapper[4763]: I0930 13:35:49.488893 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.488903 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.489075 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:35:49 crc kubenswrapper[4763]: E0930 13:35:49.489246 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:35:50 crc kubenswrapper[4763]: I0930 13:35:50.637913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e"} Sep 30 13:35:50 crc kubenswrapper[4763]: I0930 13:35:50.655274 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:50 crc kubenswrapper[4763]: I0930 13:35:50.667539 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:50 crc kubenswrapper[4763]: I0930 13:35:50.682638 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:50 crc kubenswrapper[4763]: I0930 13:35:50.699617 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:50 crc kubenswrapper[4763]: I0930 13:35:50.720467 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:50 crc kubenswrapper[4763]: I0930 13:35:50.741899 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:50 crc kubenswrapper[4763]: I0930 13:35:50.760415 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:50 crc kubenswrapper[4763]: I0930 13:35:50.774801 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:50 crc kubenswrapper[4763]: I0930 13:35:50.979823 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:35:50 crc kubenswrapper[4763]: E0930 13:35:50.980100 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:35:54.980050277 +0000 UTC m=+27.118610572 (durationBeforeRetry 4s). 
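[Annotation] Every "Failed to update status for pod" entry above shares one root cause: each status patch is routed through the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, roughly five weeks before the node's current clock (2025-09-30T13:35:50Z). Below is a minimal Go sketch of the validity-window check that crypto/x509 performs during the TLS handshake; it assumes the webhook's serving certificate (mounted at /etc/webhook-cert/ in the network-node-identity-vrzqb pod, per its status above) has been exported to a local PEM file, whose name here is hypothetical.

```go
// Minimal sketch, assuming the webhook serving cert was exported to a local
// PEM file; "webhook-cert.pem" is a hypothetical path, not one from the log.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	pemBytes, err := os.ReadFile("webhook-cert.pem") // hypothetical path
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		log.Fatal("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now()
	// The same window check crypto/x509 applies during verification; for the
	// log above: now = 2025-09-30T13:35:50Z, NotAfter = 2025-08-24T17:21:41Z.
	if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
		fmt.Printf("certificate invalid: current time %s is outside [%s, %s]\n",
			now.UTC().Format(time.RFC3339),
			cert.NotBefore.UTC().Format(time.RFC3339),
			cert.NotAfter.UTC().Format(time.RFC3339))
	} else {
		fmt.Println("certificate is within its validity window")
	}
}
```

Until that certificate is rotated, every patch the kubelet sends through this webhook fails at the TLS layer, which is why the same error text repeats verbatim for each pod.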
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:35:51 crc kubenswrapper[4763]: I0930 13:35:51.081397 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:35:51 crc kubenswrapper[4763]: I0930 13:35:51.081466 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:51 crc kubenswrapper[4763]: I0930 13:35:51.081495 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:35:51 crc kubenswrapper[4763]: I0930 13:35:51.081528 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.081655 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.081743 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:55.08171961 +0000 UTC m=+27.220279895 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.081768 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.081788 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.081869 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.081886 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.081904 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.081799 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.081944 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:55.081903584 +0000 UTC m=+27.220463869 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.081957 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.081975 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:55.081965666 +0000 UTC m=+27.220525951 (durationBeforeRetry 4s). 
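[Annotation] The MountVolume.SetUp failures for kube-api-access-s2dwl and kube-api-access-cqllr concern projected service-account volumes: each assembles a service-account token plus the kube-root-ca.crt and openshift-service-ca.crt configmaps, and the mount cannot proceed until the kubelet has (re)registered those source objects, hence the 4s durationBeforeRetry backoff. A sketch of the shape of such a volume using the k8s.io/api types follows; the object names match the log, but everything else is a generic reconstruction, not the exact pod spec.

```go
// Sketch of a kube-api-access-style projected volume, assuming the
// conventional token/CA layout; only the configmap names come from the log.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func int64Ptr(v int64) *int64 { return &v }

func main() {
	vol := corev1.Volume{
		Name: "kube-api-access-s2dwl",
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					// Bound service-account token, rotated by the kubelet.
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path:              "token",
						ExpirationSeconds: int64Ptr(3607),
					}},
					// The two configmaps reported above as "not registered".
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
					}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "openshift-service-ca.crt"},
					}},
				},
			},
		},
	}
	fmt.Printf("%+v\n", vol)
}
```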
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.082013 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:55.081993676 +0000 UTC m=+27.220554181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:51 crc kubenswrapper[4763]: I0930 13:35:51.489054 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:51 crc kubenswrapper[4763]: I0930 13:35:51.489169 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:35:51 crc kubenswrapper[4763]: I0930 13:35:51.489080 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.489371 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.489468 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:35:51 crc kubenswrapper[4763]: E0930 13:35:51.489642 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.489394 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.489483 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:35:53 crc kubenswrapper[4763]: E0930 13:35:53.489558 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:35:53 crc kubenswrapper[4763]: E0930 13:35:53.489760 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.489862 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:35:53 crc kubenswrapper[4763]: E0930 13:35:53.489965 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
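[Annotation] With no CNI configuration in /etc/kubernetes/cni/net.d/, the kubelet cannot create new pod sandboxes, so the three pods above stay in ContainerCreating; the entries that follow show the kubelet marking the node NotReady at 13:35:53 with the same message. One way to confirm the resulting Ready condition from outside the node is a client-go lookup, as in this sketch (the kubeconfig path is hypothetical):

```go
// Sketch: read the node's Ready condition with client-go.
// "/path/to/kubeconfig" is a hypothetical placeholder.
package main

import (
	"context"
	"fmt"
	"log"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		log.Fatal(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		log.Fatal(err)
	}
	node, err := clientset.CoreV1().Nodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, cond := range node.Status.Conditions {
		if cond.Type == corev1.NodeReady {
			// Per the log, expect Status=False, Reason=KubeletNotReady while
			// the CNI config in /etc/kubernetes/cni/net.d/ is missing.
			fmt.Printf("Ready=%s reason=%s message=%q\n",
				cond.Status, cond.Reason, cond.Message)
		}
	}
}
```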
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.627543 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.630081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.630147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.630168 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.630289 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.640366 4763 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.640526 4763 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.642341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.642416 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.642442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.642466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.642482 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:53Z","lastTransitionTime":"2025-09-30T13:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:53 crc kubenswrapper[4763]: E0930 13:35:53.663392 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.668621 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.668663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.668682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.668707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.668723 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:53Z","lastTransitionTime":"2025-09-30T13:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:53 crc kubenswrapper[4763]: E0930 13:35:53.684462 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.689671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.689743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.689764 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.689794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.689814 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:53Z","lastTransitionTime":"2025-09-30T13:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:53 crc kubenswrapper[4763]: E0930 13:35:53.710236 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.715377 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.715435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.715448 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.715468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.715481 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:53Z","lastTransitionTime":"2025-09-30T13:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:53 crc kubenswrapper[4763]: E0930 13:35:53.728488 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.733529 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.733573 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.733585 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.733623 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.733636 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:53Z","lastTransitionTime":"2025-09-30T13:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:53 crc kubenswrapper[4763]: E0930 13:35:53.754279 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:53 crc kubenswrapper[4763]: E0930 13:35:53.754529 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.757799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.757839 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.757849 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.757865 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.757876 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:53Z","lastTransitionTime":"2025-09-30T13:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.860876 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.860920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.860932 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.860953 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.860967 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:53Z","lastTransitionTime":"2025-09-30T13:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.964027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.964086 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.964096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.964114 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:53 crc kubenswrapper[4763]: I0930 13:35:53.964123 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:53Z","lastTransitionTime":"2025-09-30T13:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.067008 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.067058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.067073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.067098 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.067120 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:54Z","lastTransitionTime":"2025-09-30T13:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.169917 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.169955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.169966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.169984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.169994 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:54Z","lastTransitionTime":"2025-09-30T13:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.272465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.272530 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.272541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.272562 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.272574 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:54Z","lastTransitionTime":"2025-09-30T13:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.375077 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.375126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.375139 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.375157 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.375170 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:54Z","lastTransitionTime":"2025-09-30T13:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.478412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.478464 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.478477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.478497 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.478510 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:54Z","lastTransitionTime":"2025-09-30T13:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.515219 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-l26sn"] Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.515592 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-l26sn" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.517853 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.518009 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.518913 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.548678 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.568248 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.581391 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.581448 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.581461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.581483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.581496 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:54Z","lastTransitionTime":"2025-09-30T13:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.599676 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.616063 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhg2\" (UniqueName: \"kubernetes.io/projected/894b8880-d853-4f58-8be7-d5db22b85f6b-kube-api-access-dlhg2\") pod \"node-resolver-l26sn\" (UID: 
\"894b8880-d853-4f58-8be7-d5db22b85f6b\") " pod="openshift-dns/node-resolver-l26sn" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.616227 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/894b8880-d853-4f58-8be7-d5db22b85f6b-hosts-file\") pod \"node-resolver-l26sn\" (UID: \"894b8880-d853-4f58-8be7-d5db22b85f6b\") " pod="openshift-dns/node-resolver-l26sn" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.621992 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.659318 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.678688 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.683830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.683892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.683907 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.683931 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.683944 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:54Z","lastTransitionTime":"2025-09-30T13:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.694740 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.709827 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.717360 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/894b8880-d853-4f58-8be7-d5db22b85f6b-hosts-file\") pod \"node-resolver-l26sn\" (UID: \"894b8880-d853-4f58-8be7-d5db22b85f6b\") " pod="openshift-dns/node-resolver-l26sn" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.717476 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dlhg2\" (UniqueName: \"kubernetes.io/projected/894b8880-d853-4f58-8be7-d5db22b85f6b-kube-api-access-dlhg2\") pod \"node-resolver-l26sn\" (UID: \"894b8880-d853-4f58-8be7-d5db22b85f6b\") " pod="openshift-dns/node-resolver-l26sn" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.717583 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/894b8880-d853-4f58-8be7-d5db22b85f6b-hosts-file\") pod \"node-resolver-l26sn\" (UID: \"894b8880-d853-4f58-8be7-d5db22b85f6b\") " pod="openshift-dns/node-resolver-l26sn" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.726088 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.751509 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhg2\" (UniqueName: \"kubernetes.io/projected/894b8880-d853-4f58-8be7-d5db22b85f6b-kube-api-access-dlhg2\") pod \"node-resolver-l26sn\" (UID: \"894b8880-d853-4f58-8be7-d5db22b85f6b\") " pod="openshift-dns/node-resolver-l26sn" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.787462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.787528 4763 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.787545 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.787566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.787580 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:54Z","lastTransitionTime":"2025-09-30T13:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.828103 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l26sn" Sep 30 13:35:54 crc kubenswrapper[4763]: W0930 13:35:54.844051 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod894b8880_d853_4f58_8be7_d5db22b85f6b.slice/crio-63ca6b6d9899de487a40ddd3bda459770d5ebdf00c2d7808e1a54d43fc8e422b WatchSource:0}: Error finding container 63ca6b6d9899de487a40ddd3bda459770d5ebdf00c2d7808e1a54d43fc8e422b: Status 404 returned error can't find the container with id 63ca6b6d9899de487a40ddd3bda459770d5ebdf00c2d7808e1a54d43fc8e422b Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.891421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.891482 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.891495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.891516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.891529 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:54Z","lastTransitionTime":"2025-09-30T13:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.947085 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-49jns"] Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.947529 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.952700 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.952845 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.952865 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.952918 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.952934 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5fjhf"] Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.955133 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.958966 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.959252 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.960504 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.960784 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.960968 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.963930 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5rtn6"] Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.966914 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.968981 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-c9qpw"] Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.969903 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.970823 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-c9qpw" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.973080 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 13:35:54 crc kubenswrapper[4763]: W0930 13:35:54.977057 4763 reflector.go:561] object-"openshift-multus"/"multus-daemon-config": failed to list *v1.ConfigMap: configmaps "multus-daemon-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Sep 30 13:35:54 crc kubenswrapper[4763]: E0930 13:35:54.977122 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-daemon-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.977265 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.977627 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 13:35:54 crc kubenswrapper[4763]: W0930 13:35:54.977672 4763 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Sep 30 13:35:54 crc kubenswrapper[4763]: E0930 13:35:54.977698 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" 
cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.977762 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.977952 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.977956 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.978141 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.979553 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.994153 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.994200 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.994273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.994299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:54 crc kubenswrapper[4763]: I0930 13:35:54.994356 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:54Z","lastTransitionTime":"2025-09-30T13:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.005756 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.019083 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021117 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021253 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021296 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-env-overrides\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021344 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/766e1024-d943-4721-a366-83bc3635cc79-multus-daemon-config\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021379 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-slash\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021408 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-openvswitch\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021434 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-var-lib-cni-bin\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021461 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-systemd-units\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021491 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6prbc\" (UniqueName: \"kubernetes.io/projected/da518be6-b52d-4130-aab2-f27bfd4f9571-kube-api-access-6prbc\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021521 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da518be6-b52d-4130-aab2-f27bfd4f9571-ovn-node-metrics-cert\") pod \"ovnkube-node-5rtn6\" (UID: 
\"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021547 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-cnibin\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021576 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021623 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-run-multus-certs\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021653 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-node-log\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021679 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-ovn-kubernetes\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021706 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e3789557-abc5-4243-9049-4afe8717cdf9-rootfs\") pod \"machine-config-daemon-49jns\" (UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021741 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10c96750-42ea-4ae1-b6ae-abd96e614336-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021771 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-run-netns\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021796 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-etc-kubernetes\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021821 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zb6p\" (UniqueName: \"kubernetes.io/projected/766e1024-d943-4721-a366-83bc3635cc79-kube-api-access-4zb6p\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021849 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-bin\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021874 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-script-lib\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021904 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-system-cni-dir\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.021930 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-run-k8s-cni-cncf-io\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.021963 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:36:03.021939674 +0000 UTC m=+35.160499959 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022009 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-netns\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022042 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-var-lib-openvswitch\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022125 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-log-socket\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022180 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-config\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022258 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-hostroot\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-ovn\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022347 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-etc-openvswitch\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022377 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtwqg\" (UniqueName: \"kubernetes.io/projected/e3789557-abc5-4243-9049-4afe8717cdf9-kube-api-access-mtwqg\") pod \"machine-config-daemon-49jns\" 
(UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022429 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-cnibin\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022464 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-kubelet\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022492 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-os-release\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022520 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7blnm\" (UniqueName: \"kubernetes.io/projected/10c96750-42ea-4ae1-b6ae-abd96e614336-kube-api-access-7blnm\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022542 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/766e1024-d943-4721-a366-83bc3635cc79-cni-binary-copy\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022561 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-multus-conf-dir\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022579 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-systemd\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022613 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10c96750-42ea-4ae1-b6ae-abd96e614336-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022634 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/e3789557-abc5-4243-9049-4afe8717cdf9-mcd-auth-proxy-config\") pod \"machine-config-daemon-49jns\" (UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022653 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-system-cni-dir\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022674 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-var-lib-kubelet\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022700 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-netd\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022719 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3789557-abc5-4243-9049-4afe8717cdf9-proxy-tls\") pod \"machine-config-daemon-49jns\" (UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-multus-cni-dir\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022755 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-os-release\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022773 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-multus-socket-dir-parent\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.022793 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-var-lib-cni-multus\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.031085 4763 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.043183 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.059390 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.071882 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.085402 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.097229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.097271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.097283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.097300 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.097313 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:55Z","lastTransitionTime":"2025-09-30T13:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.106905 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124421 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-run-multus-certs\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " 
pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124504 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-node-log\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-ovn-kubernetes\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124584 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e3789557-abc5-4243-9049-4afe8717cdf9-rootfs\") pod \"machine-config-daemon-49jns\" (UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124646 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10c96750-42ea-4ae1-b6ae-abd96e614336-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124677 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-run-netns\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124725 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-etc-kubernetes\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124752 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zb6p\" (UniqueName: \"kubernetes.io/projected/766e1024-d943-4721-a366-83bc3635cc79-kube-api-access-4zb6p\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124799 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-bin\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124824 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-run-k8s-cni-cncf-io\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124870 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-netns\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124894 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-var-lib-openvswitch\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124944 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-log-socket\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.124975 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-config\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-script-lib\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-system-cni-dir\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125139 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-hostroot\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125189 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-ovn\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125222 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125274 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125309 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-cnibin\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125357 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-kubelet\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125384 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-etc-openvswitch\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125433 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtwqg\" (UniqueName: \"kubernetes.io/projected/e3789557-abc5-4243-9049-4afe8717cdf9-kube-api-access-mtwqg\") pod \"machine-config-daemon-49jns\" (UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125464 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-os-release\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125515 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7blnm\" (UniqueName: \"kubernetes.io/projected/10c96750-42ea-4ae1-b6ae-abd96e614336-kube-api-access-7blnm\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125547 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/766e1024-d943-4721-a366-83bc3635cc79-cni-binary-copy\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125623 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-multus-conf-dir\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125657 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-systemd\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125705 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10c96750-42ea-4ae1-b6ae-abd96e614336-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125735 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-var-lib-kubelet\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125782 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-netd\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125812 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3789557-abc5-4243-9049-4afe8717cdf9-proxy-tls\") pod \"machine-config-daemon-49jns\" (UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125841 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3789557-abc5-4243-9049-4afe8717cdf9-mcd-auth-proxy-config\") pod \"machine-config-daemon-49jns\" (UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125893 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-system-cni-dir\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125946 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-multus-cni-dir\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.125977 4763 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-os-release\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126028 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-multus-socket-dir-parent\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126057 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-var-lib-cni-multus\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126141 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/766e1024-d943-4721-a366-83bc3635cc79-multus-daemon-config\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126168 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-slash\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126220 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126248 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-env-overrides\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126281 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-openvswitch\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126310 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-systemd-units\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126356 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6prbc\" (UniqueName: \"kubernetes.io/projected/da518be6-b52d-4130-aab2-f27bfd4f9571-kube-api-access-6prbc\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-var-lib-cni-bin\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126409 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da518be6-b52d-4130-aab2-f27bfd4f9571-ovn-node-metrics-cert\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126437 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-cnibin\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126467 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126801 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-os-release\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126872 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-run-multus-certs\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126911 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-node-log\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126900 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-multus-socket-dir-parent\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.126934 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-var-lib-cni-bin\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127054 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-netd\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127146 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-var-lib-cni-multus\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127218 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-system-cni-dir\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127283 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e3789557-abc5-4243-9049-4afe8717cdf9-rootfs\") pod \"machine-config-daemon-49jns\" (UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127247 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-ovn-kubernetes\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127287 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127322 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10c96750-42ea-4ae1-b6ae-abd96e614336-cnibin\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.127489 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:35:55 crc 
kubenswrapper[4763]: E0930 13:35:55.127547 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:03.127526185 +0000 UTC m=+35.266086630 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127867 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-env-overrides\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127908 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-openvswitch\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127921 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-system-cni-dir\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127938 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-slash\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127977 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127994 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-cnibin\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127979 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-systemd-units\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128016 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-hostroot\") pod \"multus-c9qpw\" (UID: 
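
[Annotation] The nestedpendingoperations.go:348 entry above shows how the kubelet throttles a failing volume operation: each failure pushes the next attempt further out, and the "durationBeforeRetry 8s" here is consistent with a doubling backoff that started at a sub-second interval; the "m=+35.26..." suffix is the monotonic-clock offset since kubelet start. A rough sketch of a doubling-with-cap policy follows; the initial value and cap are assumptions for the sketch, not values read from this log.

// Illustrative doubling backoff with a cap, in the spirit of the
// kubelet's per-volume retry throttling. Initial and max are assumptions.
package main

import (
	"fmt"
	"time"
)

type backoff struct{ last time.Duration }

func (b *backoff) next() time.Duration {
	const (
		initial = 500 * time.Millisecond // assumed starting point
		ceiling = 2 * time.Minute        // assumed cap
	)
	if b.last == 0 {
		b.last = initial
	} else if b.last < ceiling {
		b.last *= 2
		if b.last > ceiling {
			b.last = ceiling
		}
	}
	return b.last
}

func main() {
	var b backoff
	for i := 0; i < 6; i++ {
		fmt.Println(b.next()) // 500ms 1s 2s 4s 8s 16s
	}
}
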
\"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128048 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-systemd\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.127981 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128075 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-ovn\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128064 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-etc-openvswitch\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.128094 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.127977 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/766e1024-d943-4721-a366-83bc3635cc79-cni-binary-copy\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.128117 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128157 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-multus-cni-dir\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.128175 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.128188 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.128199 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.128204 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:03.128180058 +0000 UTC m=+35.266740533 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128055 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-kubelet\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.128228 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:03.128220449 +0000 UTC m=+35.266780734 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128017 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-multus-conf-dir\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.128057 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128250 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-var-lib-kubelet\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128262 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10c96750-42ea-4ae1-b6ae-abd96e614336-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.128284 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:03.12827701 +0000 UTC m=+35.266837295 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128289 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-netns\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128302 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-run-netns\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128310 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-var-lib-openvswitch\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128318 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-bin\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128226 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-os-release\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128332 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-etc-kubernetes\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128339 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/766e1024-d943-4721-a366-83bc3635cc79-host-run-k8s-cni-cncf-io\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128354 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-log-socket\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128567 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3789557-abc5-4243-9049-4afe8717cdf9-mcd-auth-proxy-config\") pod \"machine-config-daemon-49jns\" (UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128787 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10c96750-42ea-4ae1-b6ae-abd96e614336-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.128905 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-config\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.129478 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-script-lib\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.130195 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.136387 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3789557-abc5-4243-9049-4afe8717cdf9-proxy-tls\") pod \"machine-config-daemon-49jns\" (UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.136453 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da518be6-b52d-4130-aab2-f27bfd4f9571-ovn-node-metrics-cert\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.156189 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7blnm\" (UniqueName: \"kubernetes.io/projected/10c96750-42ea-4ae1-b6ae-abd96e614336-kube-api-access-7blnm\") pod \"multus-additional-cni-plugins-5fjhf\" (UID: \"10c96750-42ea-4ae1-b6ae-abd96e614336\") " pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.165360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtwqg\" (UniqueName: \"kubernetes.io/projected/e3789557-abc5-4243-9049-4afe8717cdf9-kube-api-access-mtwqg\") pod \"machine-config-daemon-49jns\" (UID: \"e3789557-abc5-4243-9049-4afe8717cdf9\") " pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.167953 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.174989 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zb6p\" (UniqueName: \"kubernetes.io/projected/766e1024-d943-4721-a366-83bc3635cc79-kube-api-access-4zb6p\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.175713 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6prbc\" (UniqueName: \"kubernetes.io/projected/da518be6-b52d-4130-aab2-f27bfd4f9571-kube-api-access-6prbc\") pod \"ovnkube-node-5rtn6\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.198525 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.199784 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.199819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.199831 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.199851 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.199862 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:55Z","lastTransitionTime":"2025-09-30T13:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.236293 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni
-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a247
3a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.263681 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.274978 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.282959 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.286162 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" Sep 30 13:35:55 crc kubenswrapper[4763]: W0930 13:35:55.288398 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3789557_abc5_4243_9049_4afe8717cdf9.slice/crio-b2d67782bd51f59c625bd34abc7ea4095a97cf6e9bb29f92f038117fb42b3be7 WatchSource:0}: Error finding container b2d67782bd51f59c625bd34abc7ea4095a97cf6e9bb29f92f038117fb42b3be7: Status 404 returned error can't find the container with id b2d67782bd51f59c625bd34abc7ea4095a97cf6e9bb29f92f038117fb42b3be7 Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.292064 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:35:55 crc kubenswrapper[4763]: W0930 13:35:55.301867 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c96750_42ea_4ae1_b6ae_abd96e614336.slice/crio-8657c2d078aee838bb37c269a09f972bcbf3a6dc1145e848f9a2e2c2ebb1fab6 WatchSource:0}: Error finding container 8657c2d078aee838bb37c269a09f972bcbf3a6dc1145e848f9a2e2c2ebb1fab6: Status 404 returned error can't find the container with id 8657c2d078aee838bb37c269a09f972bcbf3a6dc1145e848f9a2e2c2ebb1fab6 Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.305154 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.306548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.306615 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.306630 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.306648 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.306661 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:55Z","lastTransitionTime":"2025-09-30T13:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:55 crc kubenswrapper[4763]: W0930 13:35:55.314985 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda518be6_b52d_4130_aab2_f27bfd4f9571.slice/crio-5333e1ff3aab79ac4c2fdc5dc93f3594e01c428fc9d4d8b30708ede2bf254cc7 WatchSource:0}: Error finding container 5333e1ff3aab79ac4c2fdc5dc93f3594e01c428fc9d4d8b30708ede2bf254cc7: Status 404 returned error can't find the container with id 5333e1ff3aab79ac4c2fdc5dc93f3594e01c428fc9d4d8b30708ede2bf254cc7 Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.325252 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.352575 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: 
I0930 13:35:55.373099 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.396694 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.409030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.409081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.409093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.409114 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.409127 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:55Z","lastTransitionTime":"2025-09-30T13:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.414486 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.431450 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.445271 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.488790 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.488845 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.489061 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.489189 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.489737 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:35:55 crc kubenswrapper[4763]: E0930 13:35:55.489815 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.512654 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.512757 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.512772 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.512820 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.512837 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:55Z","lastTransitionTime":"2025-09-30T13:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.616413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.616463 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.616474 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.616495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.616509 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:55Z","lastTransitionTime":"2025-09-30T13:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.663549 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l26sn" event={"ID":"894b8880-d853-4f58-8be7-d5db22b85f6b","Type":"ContainerStarted","Data":"01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.663642 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l26sn" event={"ID":"894b8880-d853-4f58-8be7-d5db22b85f6b","Type":"ContainerStarted","Data":"63ca6b6d9899de487a40ddd3bda459770d5ebdf00c2d7808e1a54d43fc8e422b"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.666060 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" event={"ID":"10c96750-42ea-4ae1-b6ae-abd96e614336","Type":"ContainerStarted","Data":"5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.666154 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" event={"ID":"10c96750-42ea-4ae1-b6ae-abd96e614336","Type":"ContainerStarted","Data":"8657c2d078aee838bb37c269a09f972bcbf3a6dc1145e848f9a2e2c2ebb1fab6"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.668054 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11" exitCode=0 Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.668127 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.668155 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"5333e1ff3aab79ac4c2fdc5dc93f3594e01c428fc9d4d8b30708ede2bf254cc7"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.669906 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.669936 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.669950 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"b2d67782bd51f59c625bd34abc7ea4095a97cf6e9bb29f92f038117fb42b3be7"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.679957 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.697458 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.714401 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.718963 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.719005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.719016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.719036 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.719052 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:55Z","lastTransitionTime":"2025-09-30T13:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.728885 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.746379 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.762309 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.777431 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.795536 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.809757 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.822810 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.823468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.823520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.823531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.823549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:55 crc 
kubenswrapper[4763]: I0930 13:35:55.823559 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:55Z","lastTransitionTime":"2025-09-30T13:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.845860 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.851301 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.859873 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.876338 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.899539 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.913998 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.926354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.926398 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.926408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.926424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.926435 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:55Z","lastTransitionTime":"2025-09-30T13:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.927273 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.944552 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.960343 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.973807 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:55 crc kubenswrapper[4763]: I0930 13:35:55.990431 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.004852 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.019296 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.030309 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.030349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.030360 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.030376 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.030384 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:56Z","lastTransitionTime":"2025-09-30T13:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.040608 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51
c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.055911 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.073660 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.087714 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: E0930 13:35:56.127919 4763 configmap.go:193] Couldn't get configMap openshift-multus/multus-daemon-config: failed to sync configmap cache: timed out waiting for the condition Sep 30 13:35:56 crc kubenswrapper[4763]: E0930 13:35:56.128084 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/766e1024-d943-4721-a366-83bc3635cc79-multus-daemon-config podName:766e1024-d943-4721-a366-83bc3635cc79 nodeName:}" failed. No retries permitted until 2025-09-30 13:35:56.628050422 +0000 UTC m=+28.766610727 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "multus-daemon-config" (UniqueName: "kubernetes.io/configmap/766e1024-d943-4721-a366-83bc3635cc79-multus-daemon-config") pod "multus-c9qpw" (UID: "766e1024-d943-4721-a366-83bc3635cc79") : failed to sync configmap cache: timed out waiting for the condition Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.132762 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.132817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.132831 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.132851 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.132865 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:56Z","lastTransitionTime":"2025-09-30T13:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.235697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.235733 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.235775 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.235801 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.235811 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:56Z","lastTransitionTime":"2025-09-30T13:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.339863 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.339913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.339929 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.339947 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.339958 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:56Z","lastTransitionTime":"2025-09-30T13:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.442548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.443013 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.443025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.443047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.443065 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:56Z","lastTransitionTime":"2025-09-30T13:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.503776 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.545376 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.545439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.545455 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.545482 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.545516 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:56Z","lastTransitionTime":"2025-09-30T13:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.642334 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/766e1024-d943-4721-a366-83bc3635cc79-multus-daemon-config\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.643027 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/766e1024-d943-4721-a366-83bc3635cc79-multus-daemon-config\") pod \"multus-c9qpw\" (UID: \"766e1024-d943-4721-a366-83bc3635cc79\") " pod="openshift-multus/multus-c9qpw" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.649368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.649423 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.649435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.649456 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.649466 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:56Z","lastTransitionTime":"2025-09-30T13:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.675822 4763 generic.go:334] "Generic (PLEG): container finished" podID="10c96750-42ea-4ae1-b6ae-abd96e614336" containerID="5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2" exitCode=0 Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.675931 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" event={"ID":"10c96750-42ea-4ae1-b6ae-abd96e614336","Type":"ContainerDied","Data":"5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.679405 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.679444 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.679454 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.679466 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.679479 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.698013 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.715138 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.729007 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.747284 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc 
kubenswrapper[4763]: I0930 13:35:56.758767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.758797 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.758807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.758820 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.758829 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:56Z","lastTransitionTime":"2025-09-30T13:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.760269 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.773393 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.789789 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.801575 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-c9qpw" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.804057 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: W0930 13:35:56.829805 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod766e1024_d943_4721_a366_83bc3635cc79.slice/crio-76b200cbfe632282d19e099593c2c5bc28efe8ae5c79a2801815b3de772ac717 WatchSource:0}: Error finding container 76b200cbfe632282d19e099593c2c5bc28efe8ae5c79a2801815b3de772ac717: Status 404 returned error can't find the container with id 76b200cbfe632282d19e099593c2c5bc28efe8ae5c79a2801815b3de772ac717 Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.831869 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z 
is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.846443 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.859422 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.863222 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.863271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.863289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.863316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.863335 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:56Z","lastTransitionTime":"2025-09-30T13:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.875519 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.895132 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.970116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.970186 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.970210 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:56 crc kubenswrapper[4763]: 
I0930 13:35:56.970237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:56 crc kubenswrapper[4763]: I0930 13:35:56.970249 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:56Z","lastTransitionTime":"2025-09-30T13:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.073015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.073060 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.073070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.073087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.073101 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:57Z","lastTransitionTime":"2025-09-30T13:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.169336 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-prttr"] Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.170095 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-prttr" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.177148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.177220 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.177236 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.177260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.177279 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:57Z","lastTransitionTime":"2025-09-30T13:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.178822 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.178901 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.179158 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.179280 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.191406 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.210040 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.225376 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.242291 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.248640 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f3af8022-cedc-4a5e-90e7-7110e1716c14-serviceca\") pod \"node-ca-prttr\" (UID: \"f3af8022-cedc-4a5e-90e7-7110e1716c14\") " pod="openshift-image-registry/node-ca-prttr" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.248737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftjrs\" (UniqueName: \"kubernetes.io/projected/f3af8022-cedc-4a5e-90e7-7110e1716c14-kube-api-access-ftjrs\") pod \"node-ca-prttr\" (UID: \"f3af8022-cedc-4a5e-90e7-7110e1716c14\") " pod="openshift-image-registry/node-ca-prttr" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.248768 
4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3af8022-cedc-4a5e-90e7-7110e1716c14-host\") pod \"node-ca-prttr\" (UID: \"f3af8022-cedc-4a5e-90e7-7110e1716c14\") " pod="openshift-image-registry/node-ca-prttr" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.259277 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.272716 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.280327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.280362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.280374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.280393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.280404 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:57Z","lastTransitionTime":"2025-09-30T13:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.289878 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.306502 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.318634 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.329802 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.345122 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.349654 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3af8022-cedc-4a5e-90e7-7110e1716c14-host\") pod \"node-ca-prttr\" (UID: \"f3af8022-cedc-4a5e-90e7-7110e1716c14\") " pod="openshift-image-registry/node-ca-prttr" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.349702 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f3af8022-cedc-4a5e-90e7-7110e1716c14-serviceca\") pod \"node-ca-prttr\" (UID: \"f3af8022-cedc-4a5e-90e7-7110e1716c14\") " pod="openshift-image-registry/node-ca-prttr" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.349727 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftjrs\" (UniqueName: \"kubernetes.io/projected/f3af8022-cedc-4a5e-90e7-7110e1716c14-kube-api-access-ftjrs\") pod \"node-ca-prttr\" (UID: \"f3af8022-cedc-4a5e-90e7-7110e1716c14\") " pod="openshift-image-registry/node-ca-prttr" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.349774 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3af8022-cedc-4a5e-90e7-7110e1716c14-host\") pod \"node-ca-prttr\" (UID: \"f3af8022-cedc-4a5e-90e7-7110e1716c14\") " pod="openshift-image-registry/node-ca-prttr" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.351261 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f3af8022-cedc-4a5e-90e7-7110e1716c14-serviceca\") pod \"node-ca-prttr\" (UID: \"f3af8022-cedc-4a5e-90e7-7110e1716c14\") " pod="openshift-image-registry/node-ca-prttr" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.360820 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.368469 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftjrs\" (UniqueName: \"kubernetes.io/projected/f3af8022-cedc-4a5e-90e7-7110e1716c14-kube-api-access-ftjrs\") pod \"node-ca-prttr\" (UID: \"f3af8022-cedc-4a5e-90e7-7110e1716c14\") " pod="openshift-image-registry/node-ca-prttr" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.374138 4763 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 
2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.382864 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.382892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.382903 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.382916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.382924 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:57Z","lastTransitionTime":"2025-09-30T13:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.417974 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z 
is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.485219 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.485257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.485271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.485284 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.485300 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:57Z","lastTransitionTime":"2025-09-30T13:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.488511 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-prttr" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.488558 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.488671 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:35:57 crc kubenswrapper[4763]: E0930 13:35:57.488784 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.488805 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:35:57 crc kubenswrapper[4763]: E0930 13:35:57.488940 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:35:57 crc kubenswrapper[4763]: E0930 13:35:57.489055 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:35:57 crc kubenswrapper[4763]: W0930 13:35:57.500581 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3af8022_cedc_4a5e_90e7_7110e1716c14.slice/crio-7e068aec3f3fc167d3c48a9f0e47d4f6455b8180275932a71e8827fc1e0aa756 WatchSource:0}: Error finding container 7e068aec3f3fc167d3c48a9f0e47d4f6455b8180275932a71e8827fc1e0aa756: Status 404 returned error can't find the container with id 7e068aec3f3fc167d3c48a9f0e47d4f6455b8180275932a71e8827fc1e0aa756 Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.587554 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.587612 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.587625 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.587643 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.587655 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:57Z","lastTransitionTime":"2025-09-30T13:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.686879 4763 generic.go:334] "Generic (PLEG): container finished" podID="10c96750-42ea-4ae1-b6ae-abd96e614336" containerID="18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b" exitCode=0 Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.686947 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" event={"ID":"10c96750-42ea-4ae1-b6ae-abd96e614336","Type":"ContainerDied","Data":"18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.693309 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.693356 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.693367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.693394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.693407 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:57Z","lastTransitionTime":"2025-09-30T13:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.693959 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-prttr" event={"ID":"f3af8022-cedc-4a5e-90e7-7110e1716c14","Type":"ContainerStarted","Data":"7e068aec3f3fc167d3c48a9f0e47d4f6455b8180275932a71e8827fc1e0aa756"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.704345 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9qpw" event={"ID":"766e1024-d943-4721-a366-83bc3635cc79","Type":"ContainerStarted","Data":"821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.704408 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9qpw" event={"ID":"766e1024-d943-4721-a366-83bc3635cc79","Type":"ContainerStarted","Data":"76b200cbfe632282d19e099593c2c5bc28efe8ae5c79a2801815b3de772ac717"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.710093 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.714372 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.729840 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.745927 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.775495 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.796070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.796132 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.796148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.796178 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.796192 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:57Z","lastTransitionTime":"2025-09-30T13:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.796729 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.809715 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.828212 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.846802 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.863356 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.880064 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 
13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.899409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.899442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.899451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.899466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.899476 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:57Z","lastTransitionTime":"2025-09-30T13:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.903257 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z 
is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.918294 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.934049 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.947970 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.963118 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.977674 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:57 crc kubenswrapper[4763]: I0930 13:35:57.995362 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-
30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.001764 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.001814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.001825 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.001842 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.001854 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:58Z","lastTransitionTime":"2025-09-30T13:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.011174 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.027060 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.040656 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.052331 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.065106 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.085565 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.099853 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.104232 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.104276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.104286 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.104302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.104312 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:58Z","lastTransitionTime":"2025-09-30T13:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.110783 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72
eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.122568 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.135617 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.147160 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.211284 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.211314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.211324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.211342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.211359 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:58Z","lastTransitionTime":"2025-09-30T13:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.313521 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.313561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.313571 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.313585 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.313624 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:58Z","lastTransitionTime":"2025-09-30T13:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.416502 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.416560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.416572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.416617 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.416638 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:58Z","lastTransitionTime":"2025-09-30T13:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.513022 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.519260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.519328 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.519357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.519385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.519405 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:58Z","lastTransitionTime":"2025-09-30T13:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.529121 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.543316 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.565812 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.581657 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.593890 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.610814 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.623055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.623095 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.623105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.623121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.623135 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:58Z","lastTransitionTime":"2025-09-30T13:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.623764 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.638804 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.654454 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.676014 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.692800 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.710082 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.721260 4763 generic.go:334] "Generic (PLEG): container finished" podID="10c96750-42ea-4ae1-b6ae-abd96e614336" containerID="3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438" exitCode=0 Sep 30 13:35:58 crc 
kubenswrapper[4763]: I0930 13:35:58.721333 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" event={"ID":"10c96750-42ea-4ae1-b6ae-abd96e614336","Type":"ContainerDied","Data":"3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.726345 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-prttr" event={"ID":"f3af8022-cedc-4a5e-90e7-7110e1716c14","Type":"ContainerStarted","Data":"7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.728178 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.728396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.728588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.728757 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.728833 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:58Z","lastTransitionTime":"2025-09-30T13:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.730493 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.746664 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.761531 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.775436 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.786022 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.802576 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.815767 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.833302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.833359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.833371 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.833391 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.833403 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:58Z","lastTransitionTime":"2025-09-30T13:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.833439 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.849421 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.874543 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.890771 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.912145 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.927905 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.936109 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.936430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.936515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.936632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.936718 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:58Z","lastTransitionTime":"2025-09-30T13:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.946745 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:58 crc kubenswrapper[4763]: I0930 13:35:58.961376 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.038946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.038990 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.039000 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.039017 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.039029 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:59Z","lastTransitionTime":"2025-09-30T13:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.142034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.142085 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.142105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.142126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.142140 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:59Z","lastTransitionTime":"2025-09-30T13:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.245888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.245934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.245947 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.245964 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.245975 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:59Z","lastTransitionTime":"2025-09-30T13:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.348452 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.348486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.348498 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.348511 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.348520 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:59Z","lastTransitionTime":"2025-09-30T13:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.451239 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.451313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.451330 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.451359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.451380 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:59Z","lastTransitionTime":"2025-09-30T13:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.488827 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.488827 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:35:59 crc kubenswrapper[4763]: E0930 13:35:59.489013 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.488854 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:35:59 crc kubenswrapper[4763]: E0930 13:35:59.489150 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:35:59 crc kubenswrapper[4763]: E0930 13:35:59.489183 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.555109 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.555160 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.555170 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.555190 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.555201 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:59Z","lastTransitionTime":"2025-09-30T13:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.659290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.659362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.659379 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.659407 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.659421 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:59Z","lastTransitionTime":"2025-09-30T13:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.735695 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f"} Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.738123 4763 generic.go:334] "Generic (PLEG): container finished" podID="10c96750-42ea-4ae1-b6ae-abd96e614336" containerID="81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d" exitCode=0 Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.738185 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" event={"ID":"10c96750-42ea-4ae1-b6ae-abd96e614336","Type":"ContainerDied","Data":"81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d"} Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.757049 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.762510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.762550 4763 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.762562 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.762584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.762625 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:59Z","lastTransitionTime":"2025-09-30T13:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.772368 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",
\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.787678 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.804539 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.822169 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.841816 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.857378 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.864749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.864801 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.864812 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.864828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.864839 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:59Z","lastTransitionTime":"2025-09-30T13:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.871218 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.890562 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.902017 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.916037 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.931571 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.945519 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.963930 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:35:59Z is after 2025-08-24T17:21:41Z" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.967767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.967893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.967989 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.968099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:35:59 crc kubenswrapper[4763]: I0930 13:35:59.968115 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:35:59Z","lastTransitionTime":"2025-09-30T13:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.070380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.070412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.070422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.070437 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.070447 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:00Z","lastTransitionTime":"2025-09-30T13:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.173514 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.173553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.173568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.173582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.173616 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:00Z","lastTransitionTime":"2025-09-30T13:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.275744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.275782 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.275790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.275802 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.275812 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:00Z","lastTransitionTime":"2025-09-30T13:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.378405 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.378443 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.378457 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.378472 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.378482 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:00Z","lastTransitionTime":"2025-09-30T13:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.480616 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.480649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.480664 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.480678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.480707 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:00Z","lastTransitionTime":"2025-09-30T13:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.583148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.583181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.583193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.583208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.583217 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:00Z","lastTransitionTime":"2025-09-30T13:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.686044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.686372 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.686461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.686546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.686636 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:00Z","lastTransitionTime":"2025-09-30T13:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.743859 4763 generic.go:334] "Generic (PLEG): container finished" podID="10c96750-42ea-4ae1-b6ae-abd96e614336" containerID="7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f" exitCode=0 Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.743903 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" event={"ID":"10c96750-42ea-4ae1-b6ae-abd96e614336","Type":"ContainerDied","Data":"7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f"} Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.758650 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.771849 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.786482 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.789226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.789263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.789274 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.789291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.789302 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:00Z","lastTransitionTime":"2025-09-30T13:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.801184 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.815090 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.824728 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.839275 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.851572 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.865763 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.879726 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.894396 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.894453 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.894467 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.894490 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.894504 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:00Z","lastTransitionTime":"2025-09-30T13:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.901174 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z 
is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.916120 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.933423 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.946273 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.997307 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.997362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.997373 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.997396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:00 crc kubenswrapper[4763]: I0930 13:36:00.997408 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:00Z","lastTransitionTime":"2025-09-30T13:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.100396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.100816 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.100829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.100851 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.100865 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:01Z","lastTransitionTime":"2025-09-30T13:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.205729 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.205814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.205824 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.205843 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.205853 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:01Z","lastTransitionTime":"2025-09-30T13:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.309758 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.309808 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.309820 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.309841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.309853 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:01Z","lastTransitionTime":"2025-09-30T13:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.412144 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.412176 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.412184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.412197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.412207 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:01Z","lastTransitionTime":"2025-09-30T13:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.489111 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.489184 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.489115 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:01 crc kubenswrapper[4763]: E0930 13:36:01.489260 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:01 crc kubenswrapper[4763]: E0930 13:36:01.489306 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:01 crc kubenswrapper[4763]: E0930 13:36:01.489351 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.515402 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.515451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.515463 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.515480 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.515494 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:01Z","lastTransitionTime":"2025-09-30T13:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.618289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.618368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.618406 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.618436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.618478 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:01Z","lastTransitionTime":"2025-09-30T13:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.721440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.721500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.721513 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.721533 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.721549 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:01Z","lastTransitionTime":"2025-09-30T13:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.750805 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.751142 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.751171 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.757583 4763 generic.go:334] "Generic (PLEG): container finished" podID="10c96750-42ea-4ae1-b6ae-abd96e614336" containerID="b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c" exitCode=0 Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.757659 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" event={"ID":"10c96750-42ea-4ae1-b6ae-abd96e614336","Type":"ContainerDied","Data":"b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.767577 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.780983 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.780968 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.781073 4763 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.795959 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.811778 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.824966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.825012 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.825026 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.825044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.825057 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:01Z","lastTransitionTime":"2025-09-30T13:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.827001 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.842244 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.856944 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.870106 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.882417 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.898672 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.910660 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.923566 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.932419 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.932461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.932473 4763 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.932496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.932510 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:01Z","lastTransitionTime":"2025-09-30T13:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.938693 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.960489 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb30
0913d0c2d3575f3bd4629140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.976749 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:01 crc kubenswrapper[4763]: I0930 13:36:01.990510 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.006331 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.027248 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.035670 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.035713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.035724 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.035744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.035756 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:02Z","lastTransitionTime":"2025-09-30T13:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.041753 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.056268 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.071007 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.084742 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.107802 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.141279 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://769
2b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.142620 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.142669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.142678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.142692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.142701 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:02Z","lastTransitionTime":"2025-09-30T13:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.169917 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.187581 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.203349 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.227280 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.245536 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.245637 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.245656 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:02 crc 
kubenswrapper[4763]: I0930 13:36:02.245681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.245695 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:02Z","lastTransitionTime":"2025-09-30T13:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.348272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.348333 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.348348 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.348370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.348385 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:02Z","lastTransitionTime":"2025-09-30T13:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.451198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.451248 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.451261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.451280 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.451291 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:02Z","lastTransitionTime":"2025-09-30T13:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.553591 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.553670 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.553686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.553704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.553718 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:02Z","lastTransitionTime":"2025-09-30T13:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.656592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.656697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.656706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.656722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.656732 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:02Z","lastTransitionTime":"2025-09-30T13:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.760542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.760579 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.760588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.760619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.760633 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:02Z","lastTransitionTime":"2025-09-30T13:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.764218 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" event={"ID":"10c96750-42ea-4ae1-b6ae-abd96e614336","Type":"ContainerStarted","Data":"ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87"} Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.764288 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.777048 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.789496 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.800118 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.815004 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.827570 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.839729 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.852411 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.863163 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.863193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.863203 4763 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.863217 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.863228 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:02Z","lastTransitionTime":"2025-09-30T13:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.864936 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.886206 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb30
0913d0c2d3575f3bd4629140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.898648 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.911998 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.930768 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.945814 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.963045 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.965610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.965652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.965662 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.965678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:02 crc kubenswrapper[4763]: I0930 13:36:02.965688 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:02Z","lastTransitionTime":"2025-09-30T13:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.068169 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.068241 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.068258 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.068285 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.068302 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.109664 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.109975 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:36:19.109922511 +0000 UTC m=+51.248482806 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.170366 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.170406 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.170418 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.170433 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.170444 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.179009 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.204574 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.210959 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.211027 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.211194 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.211221 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211138 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211321 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211335 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211345 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211320 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:19.21130055 +0000 UTC m=+51.349860825 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211143 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211448 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211381 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:19.211372571 +0000 UTC m=+51.349932846 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211459 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211468 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211482 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:19.211464253 +0000 UTC m=+51.350024538 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.211496 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:19.211489763 +0000 UTC m=+51.350050048 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.225900 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.240531 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.255066 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.267788 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.272502 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.272547 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.272558 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.272578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.272591 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.280471 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.292897 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.310311 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.323729 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.338562 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.353748 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.369712 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.374619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.374655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.374666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.374682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.374695 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.383588 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.404370 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath
\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.476792 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.476831 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.476841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.476856 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.476866 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.489152 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.489172 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.489402 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.489502 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.489188 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.489639 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.579391 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.579663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.579682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.579709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.579725 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.681473 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.681510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.681520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.681535 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.681544 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.767115 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.784830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.784879 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.784890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.784908 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.784918 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.887349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.887393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.887403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.887417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.887433 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.944562 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.944642 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.944656 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.944673 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.944685 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.959455 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.963116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.963172 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.963182 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.963202 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.963219 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.979502 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.983827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.983899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.983915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.983940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:03 crc kubenswrapper[4763]: I0930 13:36:03.983953 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:03Z","lastTransitionTime":"2025-09-30T13:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:03 crc kubenswrapper[4763]: E0930 13:36:03.996007 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.000041 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.000099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.000110 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.000132 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.000146 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: E0930 13:36:04.014417 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.018198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.018237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.018248 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.018267 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.018280 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: E0930 13:36:04.031082 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: E0930 13:36:04.031284 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.032987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.033043 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.033056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.033075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.033086 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.135173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.135211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.135223 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.135238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.135250 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.237962 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.238030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.238049 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.238073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.238089 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.341187 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.341227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.341238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.341252 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.341261 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.443567 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.443623 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.443642 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.443662 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.443674 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.546366 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.546408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.546422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.546439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.546450 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.648874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.648933 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.648945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.648964 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.648976 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.751303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.751341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.751351 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.751368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.751381 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.772392 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/0.log" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.777491 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140" exitCode=1 Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.777534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140"} Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.778196 4763 scope.go:117] "RemoveContainer" containerID="3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.792895 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.808315 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.821949 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.840739 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.852443 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.853703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.853751 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.853787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.853803 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.853813 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.861763 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.872963 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.885238 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.895927 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.906751 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.919385 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.931096 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.946013 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.956387 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.956417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.956426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.956440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.956449 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:04Z","lastTransitionTime":"2025-09-30T13:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:04 crc kubenswrapper[4763]: I0930 13:36:04.965886 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb30
0913d0c2d3575f3bd4629140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.939985 6077 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940075 6077 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940131 6077 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.940361 6077 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940778 6077 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.941696 6077 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:03.941724 6077 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:36:03.941738 6077 factory.go:656] Stopping watch factory\\\\nI0930 13:36:03.941752 6077 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:03.941767 6077 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 
13:36:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:04Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.059692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.059730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.059741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.059756 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.059766 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:05Z","lastTransitionTime":"2025-09-30T13:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.161857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.161891 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.161901 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.161945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.161954 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:05Z","lastTransitionTime":"2025-09-30T13:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.264877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.265197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.265208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.265226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.265238 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:05Z","lastTransitionTime":"2025-09-30T13:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.367501 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.367749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.367762 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.367778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.367788 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:05Z","lastTransitionTime":"2025-09-30T13:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.470155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.470191 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.470201 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.470215 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.470225 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:05Z","lastTransitionTime":"2025-09-30T13:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.488740 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.488750 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:05 crc kubenswrapper[4763]: E0930 13:36:05.489029 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:05 crc kubenswrapper[4763]: E0930 13:36:05.488896 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.488771 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:05 crc kubenswrapper[4763]: E0930 13:36:05.489114 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.572785 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.572809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.572817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.572832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.572843 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:05Z","lastTransitionTime":"2025-09-30T13:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.675715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.675759 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.675769 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.675784 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.675795 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:05Z","lastTransitionTime":"2025-09-30T13:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.778267 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.778301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.778310 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.778321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.778330 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:05Z","lastTransitionTime":"2025-09-30T13:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.781403 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/1.log" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.781876 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/0.log" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.784378 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933" exitCode=1 Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.784411 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933"} Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.784450 4763 scope.go:117] "RemoveContainer" containerID="3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.785552 4763 scope.go:117] "RemoveContainer" containerID="c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933" Sep 30 13:36:05 crc kubenswrapper[4763]: E0930 13:36:05.785770 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.801129 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.817308 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.838146 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.939985 6077 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940075 6077 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940131 6077 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.940361 6077 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940778 6077 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.941696 6077 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:03.941724 6077 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:36:03.941738 6077 factory.go:656] Stopping watch factory\\\\nI0930 13:36:03.941752 6077 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:03.941767 6077 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:05Z\\\",\\\"message\\\":\\\"94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563140 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:05.563151 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563126 6212 factory.go:656] Stopping watch factory\\\\nI0930 13:36:05.563212 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:05.563218 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563233 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:05.563295 6212 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.852360 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.864251 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.877040 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.880998 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.881041 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.881052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.881071 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.881082 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:05Z","lastTransitionTime":"2025-09-30T13:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.889583 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.904546 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad7150425342
4f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.917616 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.929350 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.940051 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.949874 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.963881 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.973468 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:05Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.983469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.983514 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.983523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.983538 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:05 crc kubenswrapper[4763]: I0930 13:36:05.983549 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:05Z","lastTransitionTime":"2025-09-30T13:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.086261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.086302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.086310 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.086324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.086332 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:06Z","lastTransitionTime":"2025-09-30T13:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.189153 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.189198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.189213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.189229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.189240 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:06Z","lastTransitionTime":"2025-09-30T13:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
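[Annotation] The repeated "failed calling webhook" entries above share one root cause: the serving certificate of the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, so every status patch kubelet posts is rejected at the TLS layer. A minimal Go sketch (not part of kubelet; the endpoint is taken verbatim from the log) that fetches the certificate with verification disabled and prints its validity window:

package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Endpoint taken from the log lines above. Verification is skipped
	// on purpose: an expired certificate is exactly what we expect to find.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	leaf := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", leaf.Subject)
	fmt.Println("notBefore:", leaf.NotBefore.UTC().Format(time.RFC3339))
	fmt.Println("notAfter: ", leaf.NotAfter.UTC().Format(time.RFC3339))
	fmt.Println("expired:  ", time.Now().UTC().After(leaf.NotAfter))
}

This pattern is commonly seen when a CRC VM is started after its embedded certificates have lapsed; the cluster normally rotates them shortly after startup, at which point these entries stop.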
Has your network provider started?"} Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.291474 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.291516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.291528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.291542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.291553 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:06Z","lastTransitionTime":"2025-09-30T13:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.394005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.394045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.394057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.394075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.394087 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:06Z","lastTransitionTime":"2025-09-30T13:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.496613 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.496658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.496668 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.496681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.496689 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:06Z","lastTransitionTime":"2025-09-30T13:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.599334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.599362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.599370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.599384 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.599392 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:06Z","lastTransitionTime":"2025-09-30T13:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.702059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.702168 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.702178 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.702195 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.702204 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:06Z","lastTransitionTime":"2025-09-30T13:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
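[Annotation] The "Node became not ready" entries repeat the same Ready condition once per status-update tick. Decoding the condition={...} JSON into a typed struct makes the reason/message pair easier to work with than grepping the raw lines; a small sketch whose field names mirror the JSON keys printed above:

package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// NodeCondition mirrors the keys in the condition={...} values above.
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// One condition value copied from the entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:06Z","lastTransitionTime":"2025-09-30T13:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s=%s since %s (%s)\n", c.Type, c.Status, c.LastTransitionTime, c.Reason)
}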
Has your network provider started?"} Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.789030 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/1.log" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.804306 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.804349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.804361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.804374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.804386 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:06Z","lastTransitionTime":"2025-09-30T13:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.907413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.907465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.907481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.907504 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:06 crc kubenswrapper[4763]: I0930 13:36:06.907519 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:06Z","lastTransitionTime":"2025-09-30T13:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.010284 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.010318 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.010330 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.010346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.010356 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:07Z","lastTransitionTime":"2025-09-30T13:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.113212 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.113254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.113273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.113291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.113303 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:07Z","lastTransitionTime":"2025-09-30T13:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.163297 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn"] Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.166049 4763 util.go:30] "No sandbox for pod can be found. 
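[Annotation] The NetworkPluginNotReady message repeated above means the container runtime has no CNI network configuration yet: ovnkube-node has not written a config into /etc/kubernetes/cni/net.d/ at this point in the boot. The CNI config loader (libcni) looks for *.conf, *.conflist, or *.json files in the conf dir, so a crude way to watch for recovery is to poll for exactly those files. A sketch, assuming the conf-dir path taken verbatim from the log message:

package main

import (
	"fmt"
	"os"
	"path/filepath"
	"time"
)

// Path taken verbatim from the log message above.
const confDir = "/etc/kubernetes/cni/net.d"

// cniConfigs lists the files libcni would consider network configs.
func cniConfigs() ([]string, error) {
	entries, err := os.ReadDir(confDir)
	if err != nil {
		return nil, err
	}
	var found []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	return found, nil
}

func main() {
	for {
		found, err := cniConfigs()
		switch {
		case err != nil:
			fmt.Println("read dir:", err)
		case len(found) > 0:
			fmt.Println("CNI config present:", found)
			return
		default:
			fmt.Println("no CNI configuration file yet, waiting...")
		}
		time.Sleep(2 * time.Second)
	}
}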
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.168744 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.168743 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.182281 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
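[Annotation] The status payloads in these entries look mangled because each patch is a JSON document embedded as a quoted string inside the quoted err="..." log field, so every interior quote is escaped twice (\\\"). Two rounds of unquoting recover readable JSON; a sketch using a trimmed stand-in for the multi-kilobyte payloads above:

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"strconv"
	"strings"
)

func main() {
	// The err="..." field as it appears in the journal, trimmed to a tiny
	// stand-in for the real payloads.
	field := `"failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"}}\" for pod"`

	// Pass 1: undo the quoting of the log field itself.
	msg, err := strconv.Unquote(field)
	if err != nil {
		log.Fatal(err)
	}

	// Pass 2: cut out the embedded quoted JSON string and unquote it too.
	start := strings.IndexByte(msg, '"')
	end := strings.LastIndexByte(msg, '"')
	patch, err := strconv.Unquote(msg[start : end+1])
	if err != nil {
		log.Fatal(err)
	}

	// Pretty-print the recovered patch document.
	var buf bytes.Buffer
	if err := json.Indent(&buf, []byte(patch), "", "  "); err != nil {
		log.Fatal(err)
	}
	fmt.Println(buf.String())
}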
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.194895 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
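[Annotation] Structurally, each of these failed patches is a Kubernetes strategic merge patch: the $setElementOrder/conditions directive pins the ordering of the conditions list, and individual condition entries are merged into the existing status by their "type" key, so kubelet only sends the conditions that changed. A sketch that builds the same shape of document (values are stand-ins copied from the multus pod's patch above):

package main

import (
	"encoding/json"
	"fmt"
	"log"
)

func main() {
	patch := map[string]any{
		"metadata": map[string]any{
			"uid": "10c96750-42ea-4ae1-b6ae-abd96e614336",
		},
		"status": map[string]any{
			// Strategic-merge-patch directive: pins the order of the
			// conditions list, whose elements are identified by "type".
			"$setElementOrder/conditions": []map[string]string{
				{"type": "PodReadyToStartContainers"},
				{"type": "Initialized"},
				{"type": "Ready"},
				{"type": "ContainersReady"},
				{"type": "PodScheduled"},
			},
			// Only changed conditions need to be sent; each is merged
			// into the existing status by its "type" merge key.
			"conditions": []map[string]any{
				{"type": "Ready", "status": "True", "lastTransitionTime": "2025-09-30T13:36:02Z"},
			},
		},
	}

	out, err := json.MarshalIndent(patch, "", "  ")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(out))
}

kubelet's status manager sends this body as an application/strategic-merge-patch+json PATCH to the pod's status subresource; in the entries above each attempt dies before that point, at the admission webhook's TLS handshake.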
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.206020 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.215584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.215637 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.215649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.215668 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.215680 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:07Z","lastTransitionTime":"2025-09-30T13:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.217678 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc 
kubenswrapper[4763]: I0930 13:36:07.230170 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"
cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.240837 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.253285 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.255245 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnq6r\" (UniqueName: \"kubernetes.io/projected/cc0ba969-357e-406f-bf02-4e01f260d447-kube-api-access-nnq6r\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc 
kubenswrapper[4763]: I0930 13:36:07.255274 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cc0ba969-357e-406f-bf02-4e01f260d447-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.255293 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc0ba969-357e-406f-bf02-4e01f260d447-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.255313 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cc0ba969-357e-406f-bf02-4e01f260d447-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.262780 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512
d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.282249 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5d
e51befd54efeff615adf0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.939985 6077 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940075 6077 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940131 6077 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.940361 6077 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940778 6077 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.941696 6077 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:03.941724 6077 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:36:03.941738 6077 factory.go:656] Stopping watch factory\\\\nI0930 13:36:03.941752 6077 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:03.941767 6077 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:05Z\\\",\\\"message\\\":\\\"94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563140 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:05.563151 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563126 6212 factory.go:656] Stopping watch factory\\\\nI0930 13:36:05.563212 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:05.563218 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563233 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:05.563295 6212 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.296858 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.311291 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.318018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.318050 4763 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.318059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.318074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.318084 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:07Z","lastTransitionTime":"2025-09-30T13:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.326518 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"na
me\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.339684 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.353811 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.356134 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnq6r\" (UniqueName: \"kubernetes.io/projected/cc0ba969-357e-406f-bf02-4e01f260d447-kube-api-access-nnq6r\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.356176 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cc0ba969-357e-406f-bf02-4e01f260d447-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.356318 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc0ba969-357e-406f-bf02-4e01f260d447-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.356361 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cc0ba969-357e-406f-bf02-4e01f260d447-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.357152 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cc0ba969-357e-406f-bf02-4e01f260d447-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.358055 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cc0ba969-357e-406f-bf02-4e01f260d447-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.367698 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.368387 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc0ba969-357e-406f-bf02-4e01f260d447-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.375199 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnq6r\" (UniqueName: \"kubernetes.io/projected/cc0ba969-357e-406f-bf02-4e01f260d447-kube-api-access-nnq6r\") pod \"ovnkube-control-plane-749d76644c-4dsgn\" (UID: \"cc0ba969-357e-406f-bf02-4e01f260d447\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.420652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.420696 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.420708 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.420724 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.420736 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:07Z","lastTransitionTime":"2025-09-30T13:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.484701 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.488876 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.488965 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.489036 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:07 crc kubenswrapper[4763]: E0930 13:36:07.489146 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:07 crc kubenswrapper[4763]: E0930 13:36:07.488992 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:07 crc kubenswrapper[4763]: E0930 13:36:07.489266 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:07 crc kubenswrapper[4763]: W0930 13:36:07.506015 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc0ba969_357e_406f_bf02_4e01f260d447.slice/crio-0039592ed9c00c0efdd9af6ca6f43902c2e5f46084db81a01ad9daefb72eed1b WatchSource:0}: Error finding container 0039592ed9c00c0efdd9af6ca6f43902c2e5f46084db81a01ad9daefb72eed1b: Status 404 returned error can't find the container with id 0039592ed9c00c0efdd9af6ca6f43902c2e5f46084db81a01ad9daefb72eed1b Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.525465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.525523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.525533 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.525550 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.525562 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:07Z","lastTransitionTime":"2025-09-30T13:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.627572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.627627 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.627641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.627655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.627664 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:07Z","lastTransitionTime":"2025-09-30T13:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.730874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.730922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.730931 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.730948 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.730959 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:07Z","lastTransitionTime":"2025-09-30T13:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.804855 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" event={"ID":"cc0ba969-357e-406f-bf02-4e01f260d447","Type":"ContainerStarted","Data":"0039592ed9c00c0efdd9af6ca6f43902c2e5f46084db81a01ad9daefb72eed1b"} Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.834902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.834952 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.834963 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.834983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.834996 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:07Z","lastTransitionTime":"2025-09-30T13:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.937364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.937402 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.937430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.937445 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:07 crc kubenswrapper[4763]: I0930 13:36:07.937456 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:07Z","lastTransitionTime":"2025-09-30T13:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.040309 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.040349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.040360 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.040374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.040384 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:08Z","lastTransitionTime":"2025-09-30T13:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.142493 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.142539 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.142549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.142562 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.142571 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:08Z","lastTransitionTime":"2025-09-30T13:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.245225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.245276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.245287 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.245305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.245317 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:08Z","lastTransitionTime":"2025-09-30T13:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.347706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.347750 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.347762 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.347779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.347791 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:08Z","lastTransitionTime":"2025-09-30T13:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.450515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.450553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.450562 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.450576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.450586 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:08Z","lastTransitionTime":"2025-09-30T13:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.505683 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.516995 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.529363 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.540684 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.552760 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.552801 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.552815 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.552830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.552841 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:08Z","lastTransitionTime":"2025-09-30T13:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.552877 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.563394 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.579092 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.590064 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.605049 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.619963 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.623116 4763 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-multus/network-metrics-daemon-rggrv"] Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.623762 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:08 crc kubenswrapper[4763]: E0930 13:36:08.624224 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.641348 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5d
e51befd54efeff615adf0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.939985 6077 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940075 6077 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940131 6077 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.940361 6077 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940778 6077 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.941696 6077 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:03.941724 6077 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:36:03.941738 6077 factory.go:656] Stopping watch factory\\\\nI0930 13:36:03.941752 6077 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:03.941767 6077 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:05Z\\\",\\\"message\\\":\\\"94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563140 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:05.563151 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563126 6212 factory.go:656] Stopping watch factory\\\\nI0930 13:36:05.563212 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:05.563218 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563233 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:05.563295 6212 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.654630 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.654669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.654679 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.654696 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.654708 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:08Z","lastTransitionTime":"2025-09-30T13:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.654650 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.667715 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.683266 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.697375 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.710697 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.721831 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.733181 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.745729 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.757209 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.757831 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.757871 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.757882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.757902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.757913 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:08Z","lastTransitionTime":"2025-09-30T13:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.766504 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc 
kubenswrapper[4763]: I0930 13:36:08.768838 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.768907 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vdcn\" (UniqueName: \"kubernetes.io/projected/394a12b5-37c3-4933-af17-71f5c84ec2fa-kube-api-access-4vdcn\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.780261 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8
afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.791168 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.804063 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.809465 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" event={"ID":"cc0ba969-357e-406f-bf02-4e01f260d447","Type":"ContainerStarted","Data":"470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 
13:36:08.809514 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" event={"ID":"cc0ba969-357e-406f-bf02-4e01f260d447","Type":"ContainerStarted","Data":"7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.816826 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:3
5:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.836114 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-li
b\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.939985 6077 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940075 6077 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940131 6077 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.940361 6077 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940778 6077 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.941696 6077 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:03.941724 6077 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:36:03.941738 6077 factory.go:656] Stopping watch factory\\\\nI0930 13:36:03.941752 6077 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:03.941767 6077 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI0930 13:36:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:05Z\\\",\\\"message\\\":\\\"94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563140 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:05.563151 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563126 6212 factory.go:656] Stopping watch factory\\\\nI0930 13:36:05.563212 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:05.563218 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563233 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:05.563295 6212 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.847632 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.860630 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.860666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.860675 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.860688 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.860697 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:08Z","lastTransitionTime":"2025-09-30T13:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.860714 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.869299 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.869393 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vdcn\" (UniqueName: \"kubernetes.io/projected/394a12b5-37c3-4933-af17-71f5c84ec2fa-kube-api-access-4vdcn\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:08 crc kubenswrapper[4763]: E0930 13:36:08.869461 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:08 crc kubenswrapper[4763]: E0930 13:36:08.869574 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs podName:394a12b5-37c3-4933-af17-71f5c84ec2fa nodeName:}" failed. No retries permitted until 2025-09-30 13:36:09.369548811 +0000 UTC m=+41.508109166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs") pod "network-metrics-daemon-rggrv" (UID: "394a12b5-37c3-4933-af17-71f5c84ec2fa") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.874189 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.885454 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vdcn\" (UniqueName: \"kubernetes.io/projected/394a12b5-37c3-4933-af17-71f5c84ec2fa-kube-api-access-4vdcn\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.886761 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.896623 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.909971 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.922011 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.931517 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.944679 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mou
ntPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.954883 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.963929 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.963973 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.963984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:08 crc 
kubenswrapper[4763]: I0930 13:36:08.964001 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.964016 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:08Z","lastTransitionTime":"2025-09-30T13:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.967379 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.979653 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:08 crc kubenswrapper[4763]: I0930 13:36:08.994512 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:08Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.006425 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.025786 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.939985 6077 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940075 6077 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940131 6077 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.940361 6077 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940778 6077 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.941696 6077 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:03.941724 6077 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:36:03.941738 6077 factory.go:656] Stopping watch factory\\\\nI0930 13:36:03.941752 6077 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:03.941767 6077 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:05Z\\\",\\\"message\\\":\\\"94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563140 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:05.563151 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563126 6212 factory.go:656] Stopping watch factory\\\\nI0930 13:36:05.563212 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:05.563218 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563233 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:05.563295 6212 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.037203 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.050626 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.063891 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:09Z is after 2025-08-24T17:21:41Z" Sep 30 
13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.067176 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.067220 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.067229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.067247 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.067256 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:09Z","lastTransitionTime":"2025-09-30T13:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.076493 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.087377 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.098508 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.169808 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.169845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.169855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.169871 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.169881 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:09Z","lastTransitionTime":"2025-09-30T13:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.272054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.272414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.272487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.272575 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.272692 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:09Z","lastTransitionTime":"2025-09-30T13:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.373432 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:09 crc kubenswrapper[4763]: E0930 13:36:09.373709 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:09 crc kubenswrapper[4763]: E0930 13:36:09.373875 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs podName:394a12b5-37c3-4933-af17-71f5c84ec2fa nodeName:}" failed. 
No retries permitted until 2025-09-30 13:36:10.373791459 +0000 UTC m=+42.512351784 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs") pod "network-metrics-daemon-rggrv" (UID: "394a12b5-37c3-4933-af17-71f5c84ec2fa") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.375377 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.375517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.375628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.375701 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.375770 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:09Z","lastTransitionTime":"2025-09-30T13:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.478905 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.478960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.478976 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.478998 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.479012 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:09Z","lastTransitionTime":"2025-09-30T13:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.488708 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.488736 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:09 crc kubenswrapper[4763]: E0930 13:36:09.488891 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.489378 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:09 crc kubenswrapper[4763]: E0930 13:36:09.489493 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:09 crc kubenswrapper[4763]: E0930 13:36:09.489582 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.581188 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.581224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.581233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.581245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.581255 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:09Z","lastTransitionTime":"2025-09-30T13:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.683034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.683065 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.683074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.683088 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.683096 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:09Z","lastTransitionTime":"2025-09-30T13:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.785351 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.785395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.785404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.785436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.785447 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:09Z","lastTransitionTime":"2025-09-30T13:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.887674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.887704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.887716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.887731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.887745 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:09Z","lastTransitionTime":"2025-09-30T13:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.989542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.989588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.989611 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.989627 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:09 crc kubenswrapper[4763]: I0930 13:36:09.989637 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:09Z","lastTransitionTime":"2025-09-30T13:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.091891 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.091935 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.091949 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.091967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.091977 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:10Z","lastTransitionTime":"2025-09-30T13:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.194394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.194445 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.194458 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.194475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.194490 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:10Z","lastTransitionTime":"2025-09-30T13:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.297320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.297366 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.297377 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.297393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.297403 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:10Z","lastTransitionTime":"2025-09-30T13:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.384936 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:10 crc kubenswrapper[4763]: E0930 13:36:10.385120 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:10 crc kubenswrapper[4763]: E0930 13:36:10.385232 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs podName:394a12b5-37c3-4933-af17-71f5c84ec2fa nodeName:}" failed. No retries permitted until 2025-09-30 13:36:12.385207381 +0000 UTC m=+44.523767736 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs") pod "network-metrics-daemon-rggrv" (UID: "394a12b5-37c3-4933-af17-71f5c84ec2fa") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.400115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.400162 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.400172 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.400187 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.400196 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:10Z","lastTransitionTime":"2025-09-30T13:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.488922 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:10 crc kubenswrapper[4763]: E0930 13:36:10.489384 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.502489 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.502527 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.502537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.502553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.502563 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:10Z","lastTransitionTime":"2025-09-30T13:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.605486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.605534 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.605546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.605564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.605575 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:10Z","lastTransitionTime":"2025-09-30T13:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.709026 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.709079 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.709094 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.709113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.709130 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:10Z","lastTransitionTime":"2025-09-30T13:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.812545 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.812649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.812670 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.812710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.812729 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:10Z","lastTransitionTime":"2025-09-30T13:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.916778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.916828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.916840 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.916855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:10 crc kubenswrapper[4763]: I0930 13:36:10.916866 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:10Z","lastTransitionTime":"2025-09-30T13:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.019556 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.019620 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.019638 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.019663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.019677 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:11Z","lastTransitionTime":"2025-09-30T13:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.122167 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.122198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.122206 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.122219 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.122228 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:11Z","lastTransitionTime":"2025-09-30T13:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.224469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.224505 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.224513 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.224526 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.224538 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:11Z","lastTransitionTime":"2025-09-30T13:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.327784 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.327834 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.327845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.327869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.327880 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:11Z","lastTransitionTime":"2025-09-30T13:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.431060 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.431111 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.431121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.431138 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.431152 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:11Z","lastTransitionTime":"2025-09-30T13:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.488842 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.488906 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.489101 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:11 crc kubenswrapper[4763]: E0930 13:36:11.489094 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:11 crc kubenswrapper[4763]: E0930 13:36:11.489200 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:11 crc kubenswrapper[4763]: E0930 13:36:11.489273 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.533263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.533307 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.533318 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.533335 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.533346 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:11Z","lastTransitionTime":"2025-09-30T13:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.636638 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.636689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.636698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.636716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.636726 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:11Z","lastTransitionTime":"2025-09-30T13:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.744737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.744834 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.744856 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.744886 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.744914 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:11Z","lastTransitionTime":"2025-09-30T13:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.847404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.847456 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.847470 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.847487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.847499 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:11Z","lastTransitionTime":"2025-09-30T13:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.950619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.950672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.950690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.950707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:11 crc kubenswrapper[4763]: I0930 13:36:11.950718 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:11Z","lastTransitionTime":"2025-09-30T13:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.053652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.053707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.053720 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.053736 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.053747 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:12Z","lastTransitionTime":"2025-09-30T13:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.156414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.156464 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.156477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.156493 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.156503 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:12Z","lastTransitionTime":"2025-09-30T13:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.259675 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.259713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.259731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.259747 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.259758 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:12Z","lastTransitionTime":"2025-09-30T13:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.362675 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.362723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.362738 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.362755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.362765 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:12Z","lastTransitionTime":"2025-09-30T13:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.405443 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:12 crc kubenswrapper[4763]: E0930 13:36:12.405610 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:12 crc kubenswrapper[4763]: E0930 13:36:12.405699 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs podName:394a12b5-37c3-4933-af17-71f5c84ec2fa nodeName:}" failed. No retries permitted until 2025-09-30 13:36:16.40568183 +0000 UTC m=+48.544242115 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs") pod "network-metrics-daemon-rggrv" (UID: "394a12b5-37c3-4933-af17-71f5c84ec2fa") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.464962 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.465001 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.465010 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.465024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.465032 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:12Z","lastTransitionTime":"2025-09-30T13:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.489261 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:12 crc kubenswrapper[4763]: E0930 13:36:12.489373 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.566771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.566837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.566846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.566860 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.566869 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:12Z","lastTransitionTime":"2025-09-30T13:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.668880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.668920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.668930 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.668945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.668957 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:12Z","lastTransitionTime":"2025-09-30T13:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.771409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.771468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.771484 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.771501 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.771512 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:12Z","lastTransitionTime":"2025-09-30T13:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.874076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.874110 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.874125 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.874147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.874160 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:12Z","lastTransitionTime":"2025-09-30T13:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.976504 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.976556 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.976568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.976580 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:12 crc kubenswrapper[4763]: I0930 13:36:12.976589 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:12Z","lastTransitionTime":"2025-09-30T13:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.078640 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.078682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.078693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.078707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.078717 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:13Z","lastTransitionTime":"2025-09-30T13:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.181068 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.181105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.181115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.181131 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.181140 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:13Z","lastTransitionTime":"2025-09-30T13:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.284061 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.284126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.284142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.284165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.284178 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:13Z","lastTransitionTime":"2025-09-30T13:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.387276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.387329 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.387342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.387365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.387382 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:13Z","lastTransitionTime":"2025-09-30T13:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.488351 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.488357 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:13 crc kubenswrapper[4763]: E0930 13:36:13.488504 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
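[Annotation] The "No sandbox for pod can be found" / "Error syncing pod, skipping" pairs around this point all show the same gate: while the runtime reports NetworkReady=false (no CNI configuration in /etc/kubernetes/cni/net.d/), the kubelet refuses to create sandboxes for pods on the pod network; only host-network pods bypass the check. A schematic Go sketch of that gate, with invented types and function names, not the kubelet's actual code path:

package main

import (
	"errors"
	"fmt"
)

// Pod is a pared-down stand-in for the kubelet's pod object.
type Pod struct {
	Name        string
	HostNetwork bool
}

var errNetworkNotReady = errors.New(
	"network is not ready: container runtime network not ready: NetworkReady=false")

// ensureSandbox refuses to create a pod sandbox while the CNI-backed pod
// network is down, except for host-network pods, which do not need CNI.
// This is a sketch of the gate behind the "Error syncing pod, skipping"
// entries, not the kubelet's real implementation.
func ensureSandbox(p Pod, networkReady bool) error {
	if !networkReady && !p.HostNetwork {
		return fmt.Errorf("pod %q: %w", p.Name, errNetworkNotReady)
	}
	fmt.Printf("pod %q: starting a new sandbox\n", p.Name)
	return nil
}

func main() {
	ready := false // no CNI config file present yet
	for _, p := range []Pod{
		{Name: "openshift-multus/network-metrics-daemon-rggrv"},
		{Name: "openshift-network-diagnostics/network-check-target-xd92c"},
	} {
		if err := ensureSandbox(p, ready); err != nil {
			fmt.Println("skipping:", err)
		}
	}
}

The log resumes below.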
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:13 crc kubenswrapper[4763]: E0930 13:36:13.488644 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.488362 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:13 crc kubenswrapper[4763]: E0930 13:36:13.488722 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.489941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.489977 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.489990 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.490009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.490026 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:13Z","lastTransitionTime":"2025-09-30T13:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.592568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.592624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.592633 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.592647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.592658 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:13Z","lastTransitionTime":"2025-09-30T13:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.695481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.695518 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.695528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.695544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.695560 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:13Z","lastTransitionTime":"2025-09-30T13:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.797809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.797847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.797857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.797871 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.797885 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:13Z","lastTransitionTime":"2025-09-30T13:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.901193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.901240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.901253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.901275 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:13 crc kubenswrapper[4763]: I0930 13:36:13.901293 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:13Z","lastTransitionTime":"2025-09-30T13:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.004307 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.004533 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.004549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.004591 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.004646 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.058884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.058991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.061259 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.061520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.061559 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: E0930 13:36:14.077925 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:14Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.082545 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.082567 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.082577 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.082805 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.082819 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: E0930 13:36:14.096490 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:14Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.100169 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.100270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.100331 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.100396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.100456 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: E0930 13:36:14.112800 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:14Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.115956 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.116025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.116037 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.116055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.116068 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: E0930 13:36:14.126795 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:14Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.130797 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.130835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.130846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.130863 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.130874 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: E0930 13:36:14.143333 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:14Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:14 crc kubenswrapper[4763]: E0930 13:36:14.143507 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.145265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
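Every retry above dies in the same TLS handshake: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a certificate whose notAfter (2025-08-24T17:21:41Z) is more than a month behind the node clock (2025-09-30T13:36:14Z). A minimal Go sketch of the same validity check, useful for confirming the expiry independently of the kubelet; the endpoint address is taken from the log, everything else is illustrative:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Address taken from the failing webhook call in the log above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		// Skip chain verification so the handshake completes and the
		// expired certificate can be inspected; the kubelet does NOT
		// do this, which is exactly why its call fails.
		InsecureSkipVerify: true,
	})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now()
	fmt.Printf("NotBefore=%s NotAfter=%s now=%s\n", cert.NotBefore, cert.NotAfter, now)
	if now.After(cert.NotAfter) {
		// Mirrors the log: "x509: certificate has expired or is not yet valid".
		fmt.Println("certificate has expired")
	}
}
```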
event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.145361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.145375 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.145584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.145627 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.248064 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.248100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.248148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.248164 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.248173 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.350967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.351020 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.351036 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.351052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.351062 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.453967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.454007 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.454048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.454063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.454072 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.489294 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:14 crc kubenswrapper[4763]: E0930 13:36:14.489459 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.556820 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.556862 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.556875 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.556891 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.556901 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.659085 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.659128 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.659138 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.659154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.659163 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.762071 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.762139 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.762149 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.762166 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.762177 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.864445 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.864482 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.864491 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.864508 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.864517 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.967152 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.967207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.967219 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.967235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:14 crc kubenswrapper[4763]: I0930 13:36:14.967244 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:14Z","lastTransitionTime":"2025-09-30T13:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.070086 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.070134 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.070151 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.070169 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.070186 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:15Z","lastTransitionTime":"2025-09-30T13:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.172482 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.172517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.172525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.172539 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.172548 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:15Z","lastTransitionTime":"2025-09-30T13:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.274855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.274903 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.274936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.274954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.274966 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:15Z","lastTransitionTime":"2025-09-30T13:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.377282 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.377323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.377333 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.377348 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.377357 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:15Z","lastTransitionTime":"2025-09-30T13:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.479541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.479582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.479610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.479627 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.479639 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:15Z","lastTransitionTime":"2025-09-30T13:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.488715 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.488761 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.488775 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:15 crc kubenswrapper[4763]: E0930 13:36:15.488832 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:15 crc kubenswrapper[4763]: E0930 13:36:15.488989 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:15 crc kubenswrapper[4763]: E0930 13:36:15.489048 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.581509 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.581550 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.581578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.581592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.581624 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:15Z","lastTransitionTime":"2025-09-30T13:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.683679 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.683714 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.683723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.683739 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.683748 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:15Z","lastTransitionTime":"2025-09-30T13:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.785462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.785522 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.785532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.785548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.785557 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:15Z","lastTransitionTime":"2025-09-30T13:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.887712 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.887749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.887805 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.887826 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.887836 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:15Z","lastTransitionTime":"2025-09-30T13:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.991255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.991310 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.991343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.991369 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:15 crc kubenswrapper[4763]: I0930 13:36:15.991383 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:15Z","lastTransitionTime":"2025-09-30T13:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.093735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.093782 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.093793 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.093811 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.093822 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:16Z","lastTransitionTime":"2025-09-30T13:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.196055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.196133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.196145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.196432 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.196447 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:16Z","lastTransitionTime":"2025-09-30T13:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.298254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.298292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.298301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.298317 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.298329 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:16Z","lastTransitionTime":"2025-09-30T13:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.400480 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.400516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.400526 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.400541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.400553 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:16Z","lastTransitionTime":"2025-09-30T13:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.450784 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:16 crc kubenswrapper[4763]: E0930 13:36:16.451003 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:16 crc kubenswrapper[4763]: E0930 13:36:16.451061 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs podName:394a12b5-37c3-4933-af17-71f5c84ec2fa nodeName:}" failed. No retries permitted until 2025-09-30 13:36:24.451045597 +0000 UTC m=+56.589605882 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs") pod "network-metrics-daemon-rggrv" (UID: "394a12b5-37c3-4933-af17-71f5c84ec2fa") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.489096 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:16 crc kubenswrapper[4763]: E0930 13:36:16.489248 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.503065 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.503102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.503112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.503127 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.503137 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:16Z","lastTransitionTime":"2025-09-30T13:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.605701 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.605747 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.605760 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.605775 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.605784 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:16Z","lastTransitionTime":"2025-09-30T13:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.708445 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.708852 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.708940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.709025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.709110 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:16Z","lastTransitionTime":"2025-09-30T13:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.812431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.812496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.812515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.812540 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.812557 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:16Z","lastTransitionTime":"2025-09-30T13:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.916109 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.916189 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.916209 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.916236 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:16 crc kubenswrapper[4763]: I0930 13:36:16.916256 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:16Z","lastTransitionTime":"2025-09-30T13:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.018778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.018822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.018832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.018848 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.018860 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:17Z","lastTransitionTime":"2025-09-30T13:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.121644 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.122419 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.122499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.122577 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.122659 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:17Z","lastTransitionTime":"2025-09-30T13:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.225134 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.225173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.225185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.225202 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.225214 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:17Z","lastTransitionTime":"2025-09-30T13:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.327853 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.327898 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.327907 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.327922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.327932 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:17Z","lastTransitionTime":"2025-09-30T13:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.431185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.431258 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.431272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.431297 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.431313 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:17Z","lastTransitionTime":"2025-09-30T13:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.489167 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.489271 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.489974 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:17 crc kubenswrapper[4763]: E0930 13:36:17.490116 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:17 crc kubenswrapper[4763]: E0930 13:36:17.490284 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:17 crc kubenswrapper[4763]: E0930 13:36:17.490365 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.534768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.534817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.534830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.534845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.534857 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:17Z","lastTransitionTime":"2025-09-30T13:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.637548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.637612 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.637626 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.637644 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.637658 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:17Z","lastTransitionTime":"2025-09-30T13:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.741260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.741322 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.741337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.741355 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.741367 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:17Z","lastTransitionTime":"2025-09-30T13:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.843149 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.843243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.843269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.843302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.843326 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:17Z","lastTransitionTime":"2025-09-30T13:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.946587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.946713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.946738 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.946775 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:17 crc kubenswrapper[4763]: I0930 13:36:17.946800 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:17Z","lastTransitionTime":"2025-09-30T13:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.050794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.050849 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.050866 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.050892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.050911 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:18Z","lastTransitionTime":"2025-09-30T13:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.154384 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.154580 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.154681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.154711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.154772 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:18Z","lastTransitionTime":"2025-09-30T13:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.258416 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.258472 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.258489 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.258516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.258535 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:18Z","lastTransitionTime":"2025-09-30T13:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.362070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.362132 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.362143 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.362165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.362176 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:18Z","lastTransitionTime":"2025-09-30T13:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.465537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.465657 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.465722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.465756 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.465775 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:18Z","lastTransitionTime":"2025-09-30T13:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.489387 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:18 crc kubenswrapper[4763]: E0930 13:36:18.489700 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.511001 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.527500 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 
13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.549486 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0
cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b204576720b2abcfeeab02883b02ac72079cb300913d0c2d3575f3bd4629140\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.939985 6077 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940075 6077 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940131 6077 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.940361 6077 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 13:36:03.940778 6077 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:36:03.941696 6077 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:03.941724 6077 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:36:03.941738 6077 factory.go:656] Stopping watch factory\\\\nI0930 13:36:03.941752 6077 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:03.941767 6077 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 
13:36:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:05Z\\\",\\\"message\\\":\\\"94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563140 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:05.563151 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563126 6212 factory.go:656] Stopping watch factory\\\\nI0930 13:36:05.563212 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:05.563218 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563233 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:05.563295 6212 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.567295 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.568015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.568087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.568113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.568148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.568176 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:18Z","lastTransitionTime":"2025-09-30T13:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.582659 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.602264 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.620147 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.636701 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.659342 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-oper
ator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.671630 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.671691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.671712 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.671742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.671763 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:18Z","lastTransitionTime":"2025-09-30T13:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.675828 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.697494 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.715065 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.730216 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.743079 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.758108 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.767895 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.774439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.774485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.774498 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.774517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.774532 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:18Z","lastTransitionTime":"2025-09-30T13:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.877804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.877885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.877907 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.877940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.877960 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:18Z","lastTransitionTime":"2025-09-30T13:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.981740 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.981803 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.981817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.981838 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:18 crc kubenswrapper[4763]: I0930 13:36:18.981848 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:18Z","lastTransitionTime":"2025-09-30T13:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.085706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.085779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.085798 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.085829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.085849 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:19Z","lastTransitionTime":"2025-09-30T13:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.185965 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.186180 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:36:51.186145161 +0000 UTC m=+83.324705446 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.194011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.194049 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.194059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.194073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.194082 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:19Z","lastTransitionTime":"2025-09-30T13:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.287808 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.287912 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.287984 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288035 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288072 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288084 4763 projected.go:194] Error preparing data for projected 
Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.288090 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288144 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:51.288122001 +0000 UTC m=+83.426682286 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288254 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288322 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288380 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288396 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:51.288354585 +0000 UTC m=+83.426915000 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288410 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288550 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:51.288518019 +0000 UTC m=+83.427078464 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288835 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.288921 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:36:51.288897576 +0000 UTC m=+83.427458061 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.299241 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.299279 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.299292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.299314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.299329 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:19Z","lastTransitionTime":"2025-09-30T13:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.402782 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.402866 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.402894 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.402929 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.402960 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:19Z","lastTransitionTime":"2025-09-30T13:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.489185 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.489262 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.489262 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.489443 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.489687 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:19 crc kubenswrapper[4763]: E0930 13:36:19.489877 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.509980 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.510038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.510057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.510086 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.510108 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:19Z","lastTransitionTime":"2025-09-30T13:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.613270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.613317 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.613330 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.613347 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.613361 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:19Z","lastTransitionTime":"2025-09-30T13:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.717032 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.717086 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.717100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.717123 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.717139 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:19Z","lastTransitionTime":"2025-09-30T13:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.820405 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.820450 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.820465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.820483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.820497 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:19Z","lastTransitionTime":"2025-09-30T13:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.923458 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.923548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.923572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.923650 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:19 crc kubenswrapper[4763]: I0930 13:36:19.923687 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:19Z","lastTransitionTime":"2025-09-30T13:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.027137 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.027200 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.027215 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.027240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.027266 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:20Z","lastTransitionTime":"2025-09-30T13:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.130179 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.130248 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.130267 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.130296 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.130317 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:20Z","lastTransitionTime":"2025-09-30T13:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.233723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.233786 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.233813 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.233840 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.233861 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:20Z","lastTransitionTime":"2025-09-30T13:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.337792 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.337878 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.337916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.337952 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.337981 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:20Z","lastTransitionTime":"2025-09-30T13:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.441417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.441498 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.441518 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.441544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.441565 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:20Z","lastTransitionTime":"2025-09-30T13:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.489199 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:20 crc kubenswrapper[4763]: E0930 13:36:20.489438 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.490947 4763 scope.go:117] "RemoveContainer" containerID="c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.509366 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 
13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.529324 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.546013 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.546089 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.546105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.546126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.546163 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:20Z","lastTransitionTime":"2025-09-30T13:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.549747 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.572087 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5d
e51befd54efeff615adf0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:05Z\\\",\\\"message\\\":\\\"94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563140 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:05.563151 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563126 6212 factory.go:656] Stopping watch factory\\\\nI0930 13:36:05.563212 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:05.563218 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563233 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:05.563295 6212 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.593997 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.613229 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z"
Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.629958 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z"
Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.646219 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.648137 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.648183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.648194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.648207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.648216 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:20Z","lastTransitionTime":"2025-09-30T13:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.665270 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.691394 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.703809 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.718484 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.732444 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.751519 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.751564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.751582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.751617 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.751631 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:20Z","lastTransitionTime":"2025-09-30T13:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.759116 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.779385 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.795734 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.850118 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/1.log" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.852880 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.853051 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.853147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.853212 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.853227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.853243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.853255 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:20Z","lastTransitionTime":"2025-09-30T13:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.870782 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\
\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.896510 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192
.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.913870 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.937053 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.955836 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.956055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.956106 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.956115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.956134 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.956144 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:20Z","lastTransitionTime":"2025-09-30T13:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.973894 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:20 crc kubenswrapper[4763]: I0930 13:36:20.986037 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.004159 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:05Z\\\",\\\"message\\\":\\\"94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563140 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:05.563151 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563126 6212 factory.go:656] Stopping watch factory\\\\nI0930 13:36:05.563212 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:05.563218 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563233 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:05.563295 6212 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.015626 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.029428 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.042402 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 
13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.056530 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.058225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.058265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.058281 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.058303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.058317 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:21Z","lastTransitionTime":"2025-09-30T13:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.072150 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.086779 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.100892 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.117204 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.161244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.161294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.161304 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.161323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.161332 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:21Z","lastTransitionTime":"2025-09-30T13:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.271069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.271120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.271131 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.271151 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.271162 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:21Z","lastTransitionTime":"2025-09-30T13:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.374502 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.374549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.374560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.374579 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.374589 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:21Z","lastTransitionTime":"2025-09-30T13:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.477951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.478016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.478035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.478060 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.478077 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:21Z","lastTransitionTime":"2025-09-30T13:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.488649 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.488651 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:21 crc kubenswrapper[4763]: E0930 13:36:21.488921 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.488655 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:21 crc kubenswrapper[4763]: E0930 13:36:21.488806 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:21 crc kubenswrapper[4763]: E0930 13:36:21.489204 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.580884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.580928 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.580938 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.580955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.580964 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:21Z","lastTransitionTime":"2025-09-30T13:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.683348 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.683401 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.683413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.683430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.683444 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:21Z","lastTransitionTime":"2025-09-30T13:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.786168 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.786210 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.786221 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.786236 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.786247 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:21Z","lastTransitionTime":"2025-09-30T13:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.859205 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/2.log" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.863098 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/1.log" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.867428 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f" exitCode=1 Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.867490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f"} Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.867533 4763 scope.go:117] "RemoveContainer" containerID="c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.868365 4763 scope.go:117] "RemoveContainer" containerID="5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f" Sep 30 13:36:21 crc kubenswrapper[4763]: E0930 13:36:21.868529 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.889884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.889940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.889951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.889971 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.889983 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:21Z","lastTransitionTime":"2025-09-30T13:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.892294 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.913046 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.934547 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c665b66c2e2306c5a0acad1abc4fa2158a78ef5de51befd54efeff615adf0933\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:05Z\\\",\\\"message\\\":\\\"94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563140 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:05.563151 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563126 6212 factory.go:656] Stopping watch factory\\\\nI0930 13:36:05.563212 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:05.563218 6212 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 13:36:05.563233 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:05.563295 6212 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:21Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI0930 13:36:21.355539 6440 factory.go:1336] Added *v1.Node event handler 7\\\\nI0930 13:36:21.355587 6440 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355660 6440 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:36:21.355730 6440 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:36:21.355780 6440 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:21.355806 6440 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:21.355834 6440 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:36:21.355861 6440 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:36:21.355885 6440 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355888 6440 factory.go:656] Stopping watch factory\\\\nI0930 13:36:21.355924 6440 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:36:21.356205 6440 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0930 13:36:21.356361 6440 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 13:36:21.356439 6440 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:21.356498 6440 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:21.356614 6440 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.952674 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.966940 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.986322 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.993188 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.993250 4763 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.993270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.993342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:21 crc kubenswrapper[4763]: I0930 13:36:21.993420 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:21Z","lastTransitionTime":"2025-09-30T13:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.003768 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"na
me\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.020137 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad
7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.036657 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.054285 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.071021 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.091309 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.097256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.097304 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.097313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.097330 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.097342 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:22Z","lastTransitionTime":"2025-09-30T13:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.109215 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.124650 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.148218 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.166088 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.200703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.200758 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.200777 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.201138 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.201182 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:22Z","lastTransitionTime":"2025-09-30T13:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.305982 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.307112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.307439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.307591 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.307783 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:22Z","lastTransitionTime":"2025-09-30T13:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.345488 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.410419 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.410528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.410548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.410572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.410587 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:22Z","lastTransitionTime":"2025-09-30T13:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.488763 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:22 crc kubenswrapper[4763]: E0930 13:36:22.489184 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.513875 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.513940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.513959 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.513984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.514003 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:22Z","lastTransitionTime":"2025-09-30T13:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.617248 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.617316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.617341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.617376 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.617397 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:22Z","lastTransitionTime":"2025-09-30T13:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.720728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.720805 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.720827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.720855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.720881 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:22Z","lastTransitionTime":"2025-09-30T13:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.824660 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.824742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.824767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.824802 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.824974 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:22Z","lastTransitionTime":"2025-09-30T13:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.875307 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/2.log" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.881581 4763 scope.go:117] "RemoveContainer" containerID="5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f" Sep 30 13:36:22 crc kubenswrapper[4763]: E0930 13:36:22.881949 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.905127 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.923034 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.932405 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.932467 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.932486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.932516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.932535 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:22Z","lastTransitionTime":"2025-09-30T13:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.948296 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:22 crc kubenswrapper[4763]: I0930 13:36:22.999662 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.015038 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.036250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.036321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.036337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.036363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.036380 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:23Z","lastTransitionTime":"2025-09-30T13:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.042174 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d
3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.058651 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.071733 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.084898 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.097812 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.120019 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:21Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI0930 13:36:21.355539 6440 factory.go:1336] Added *v1.Node event handler 7\\\\nI0930 13:36:21.355587 6440 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355660 6440 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:36:21.355730 6440 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:36:21.355780 6440 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:21.355806 6440 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:21.355834 6440 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:36:21.355861 6440 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:36:21.355885 6440 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355888 6440 factory.go:656] Stopping watch factory\\\\nI0930 13:36:21.355924 6440 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:36:21.356205 6440 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0930 13:36:21.356361 6440 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 13:36:21.356439 6440 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:21.356498 6440 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:21.356614 6440 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.133701 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.138961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.139034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.139056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.139081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.139097 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:23Z","lastTransitionTime":"2025-09-30T13:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.147084 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.159422 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.170198 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.184506 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.241965 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.242227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.242243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.242267 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.242290 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:23Z","lastTransitionTime":"2025-09-30T13:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.345861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.345942 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.345964 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.345995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.346015 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:23Z","lastTransitionTime":"2025-09-30T13:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.450099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.450193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.450225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.450256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.450275 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:23Z","lastTransitionTime":"2025-09-30T13:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.488953 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.488974 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.488974 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:23 crc kubenswrapper[4763]: E0930 13:36:23.489409 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:23 crc kubenswrapper[4763]: E0930 13:36:23.489185 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:23 crc kubenswrapper[4763]: E0930 13:36:23.489531 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.553287 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.553350 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.553366 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.553394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.553410 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:23Z","lastTransitionTime":"2025-09-30T13:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.656966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.657044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.657058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.657083 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.657095 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:23Z","lastTransitionTime":"2025-09-30T13:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.760435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.760491 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.760507 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.760527 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.760546 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:23Z","lastTransitionTime":"2025-09-30T13:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.862773 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.862832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.862843 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.862862 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.862876 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:23Z","lastTransitionTime":"2025-09-30T13:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.966459 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.966527 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.966549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.966575 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:23 crc kubenswrapper[4763]: I0930 13:36:23.966639 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:23Z","lastTransitionTime":"2025-09-30T13:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.071475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.071531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.071544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.071570 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.071584 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.175339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.175387 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.175400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.175422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.175436 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.278506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.278569 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.278585 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.278607 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.278620 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.381250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.381309 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.381318 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.381338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.381348 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.407506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.407554 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.407566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.407587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.407626 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: E0930 13:36:24.429291 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.434153 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.434206 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.434221 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.434246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.434262 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: E0930 13:36:24.451434 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.458164 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.458228 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.458246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.458274 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.458293 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: E0930 13:36:24.477906 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.482860 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.482904 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.482919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.482943 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.482958 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.489530 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:24 crc kubenswrapper[4763]: E0930 13:36:24.489808 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:24 crc kubenswrapper[4763]: E0930 13:36:24.511793 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.516946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.516995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.517006 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.517024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.517038 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: E0930 13:36:24.530930 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: E0930 13:36:24.531134 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.533327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.533380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.533393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.533413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.533430 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.548330 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:24 crc kubenswrapper[4763]: E0930 13:36:24.548513 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:24 crc kubenswrapper[4763]: E0930 13:36:24.548638 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs podName:394a12b5-37c3-4933-af17-71f5c84ec2fa nodeName:}" failed. No retries permitted until 2025-09-30 13:36:40.548578944 +0000 UTC m=+72.687139249 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs") pod "network-metrics-daemon-rggrv" (UID: "394a12b5-37c3-4933-af17-71f5c84ec2fa") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.637193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.637254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.637264 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.637287 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.637301 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.739945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.739998 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.740009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.740033 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.740063 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.842651 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.842698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.842709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.842726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.842737 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.852212 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.865796 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.865935 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.885283 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.900617 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.912476 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.929828 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.945366 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.946294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.946381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.946401 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.946435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.946457 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:24Z","lastTransitionTime":"2025-09-30T13:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.964147 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:24 crc kubenswrapper[4763]: I0930 13:36:24.983091 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T13:36:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.008996 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:21Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI0930 13:36:21.355539 6440 factory.go:1336] Added *v1.Node event handler 7\\\\nI0930 13:36:21.355587 6440 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355660 6440 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:36:21.355730 6440 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:36:21.355780 6440 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:21.355806 6440 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:21.355834 6440 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:36:21.355861 6440 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:36:21.355885 6440 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355888 6440 factory.go:656] Stopping watch factory\\\\nI0930 13:36:21.355924 6440 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:36:21.356205 6440 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0930 13:36:21.356361 6440 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 13:36:21.356439 6440 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:21.356498 6440 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:21.356614 6440 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.024659 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.044124 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.049415 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.049460 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.049477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.049499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.049516 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:25Z","lastTransitionTime":"2025-09-30T13:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.065630 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.082536 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.095451 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.110939 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-oper
ator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.125978 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.152728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.152793 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.152806 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.152828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.152843 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:25Z","lastTransitionTime":"2025-09-30T13:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.256312 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.256397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.256421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.256454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.256479 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:25Z","lastTransitionTime":"2025-09-30T13:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.359217 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.359272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.359289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.359312 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.359330 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:25Z","lastTransitionTime":"2025-09-30T13:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.463052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.463122 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.463143 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.463176 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.463198 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:25Z","lastTransitionTime":"2025-09-30T13:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.489326 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.489431 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.489527 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:25 crc kubenswrapper[4763]: E0930 13:36:25.489476 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:25 crc kubenswrapper[4763]: E0930 13:36:25.489574 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:25 crc kubenswrapper[4763]: E0930 13:36:25.489676 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.567117 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.567161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.567171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.567187 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.567199 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:25Z","lastTransitionTime":"2025-09-30T13:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.669976 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.670025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.670040 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.670066 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.670080 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:25Z","lastTransitionTime":"2025-09-30T13:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.772829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.772878 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.772893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.772913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.772928 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:25Z","lastTransitionTime":"2025-09-30T13:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.875880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.875940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.875952 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.875971 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.875985 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:25Z","lastTransitionTime":"2025-09-30T13:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.978949 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.979034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.979057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.979087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:25 crc kubenswrapper[4763]: I0930 13:36:25.979110 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:25Z","lastTransitionTime":"2025-09-30T13:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.082494 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.082547 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.082558 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.082574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.082583 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:26Z","lastTransitionTime":"2025-09-30T13:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.186574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.186666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.186681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.186702 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.186715 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:26Z","lastTransitionTime":"2025-09-30T13:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.290421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.290503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.290530 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.290563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.290585 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:26Z","lastTransitionTime":"2025-09-30T13:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.394961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.395024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.395035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.395063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.395085 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:26Z","lastTransitionTime":"2025-09-30T13:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.489405 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:26 crc kubenswrapper[4763]: E0930 13:36:26.489617 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.497507 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.497693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.497727 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.497766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.497794 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:26Z","lastTransitionTime":"2025-09-30T13:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.601297 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.601386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.601406 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.601437 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.601458 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:26Z","lastTransitionTime":"2025-09-30T13:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.706956 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.707033 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.707049 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.707075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.707092 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:26Z","lastTransitionTime":"2025-09-30T13:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.810712 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.810764 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.810777 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.810797 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.810812 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:26Z","lastTransitionTime":"2025-09-30T13:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.913477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.913542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.913555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.913583 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:26 crc kubenswrapper[4763]: I0930 13:36:26.913615 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:26Z","lastTransitionTime":"2025-09-30T13:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.017428 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.017495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.017510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.017533 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.017546 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:27Z","lastTransitionTime":"2025-09-30T13:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.120640 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.120680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.120689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.120707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.120718 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:27Z","lastTransitionTime":"2025-09-30T13:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.224430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.224518 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.224546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.224586 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.224654 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:27Z","lastTransitionTime":"2025-09-30T13:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.329197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.329280 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.329310 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.329348 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.329374 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:27Z","lastTransitionTime":"2025-09-30T13:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.434058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.434144 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.434169 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.434201 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.434230 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:27Z","lastTransitionTime":"2025-09-30T13:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.488898 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.488956 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.488903 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:27 crc kubenswrapper[4763]: E0930 13:36:27.489105 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:27 crc kubenswrapper[4763]: E0930 13:36:27.489310 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:27 crc kubenswrapper[4763]: E0930 13:36:27.489519 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.538155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.538215 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.538225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.538246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.538305 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:27Z","lastTransitionTime":"2025-09-30T13:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.641207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.641254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.641264 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.641281 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.641291 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:27Z","lastTransitionTime":"2025-09-30T13:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.743955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.744017 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.744035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.744052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.744066 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:27Z","lastTransitionTime":"2025-09-30T13:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.847315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.847386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.847397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.847416 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.847436 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:27Z","lastTransitionTime":"2025-09-30T13:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.950136 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.950186 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.950200 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.950220 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:27 crc kubenswrapper[4763]: I0930 13:36:27.950232 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:27Z","lastTransitionTime":"2025-09-30T13:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.054534 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.054644 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.054665 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.054698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.054717 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:28Z","lastTransitionTime":"2025-09-30T13:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.158339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.158394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.158404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.158456 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.158471 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:28Z","lastTransitionTime":"2025-09-30T13:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.262324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.262396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.262414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.262448 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.262473 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:28Z","lastTransitionTime":"2025-09-30T13:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.365190 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.365819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.365895 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.365925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.365954 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:28Z","lastTransitionTime":"2025-09-30T13:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.468766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.468814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.468826 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.468845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.468858 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:28Z","lastTransitionTime":"2025-09-30T13:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.489428 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:28 crc kubenswrapper[4763]: E0930 13:36:28.489613 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.505722 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.519253 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.532221 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf0673c2-f0f3-4380-9228-b65c51f9184c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69768d25767b2d069d78da62764bb2be0c6c1c8f9b4378c499a20d0324fdb7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f882363620739f8700024600e56bc55742489a500c06f523fb9028dd2af5941f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c471b7d5edb6a2a0a1b7df018d846b4d54af48c83aa59b0067b9a98be067aa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.551701 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.569974 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.570957 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.571001 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.571013 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.571034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.571050 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:28Z","lastTransitionTime":"2025-09-30T13:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.583425 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.595166 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.611575 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.626863 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.643520 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.657294 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.673490 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.673535 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.673553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.673579 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.673621 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:28Z","lastTransitionTime":"2025-09-30T13:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.678006 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df892
6d28e290e2f2412a4b2c243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:21Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI0930 13:36:21.355539 6440 factory.go:1336] Added *v1.Node event handler 7\\\\nI0930 13:36:21.355587 6440 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355660 6440 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:36:21.355730 6440 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:36:21.355780 6440 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:21.355806 6440 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:21.355834 6440 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:36:21.355861 6440 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:36:21.355885 6440 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355888 6440 factory.go:656] Stopping watch factory\\\\nI0930 13:36:21.355924 6440 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:36:21.356205 6440 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0930 13:36:21.356361 6440 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 13:36:21.356439 6440 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:21.356498 6440 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:21.356614 6440 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.693815 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.711074 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.728842 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.745682 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.760343 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:28Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.776294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.776334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.776343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.776357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.776367 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:28Z","lastTransitionTime":"2025-09-30T13:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.880754 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.881221 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.881235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.881256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.881270 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:28Z","lastTransitionTime":"2025-09-30T13:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.985524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.985585 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.985637 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.985686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:28 crc kubenswrapper[4763]: I0930 13:36:28.985719 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:28Z","lastTransitionTime":"2025-09-30T13:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.089426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.089493 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.089514 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.089544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.089567 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:29Z","lastTransitionTime":"2025-09-30T13:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.192826 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.192896 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.192915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.192951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.192972 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:29Z","lastTransitionTime":"2025-09-30T13:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.296633 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.296697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.296710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.296732 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.296746 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:29Z","lastTransitionTime":"2025-09-30T13:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.401099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.401161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.401173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.401193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.401208 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:29Z","lastTransitionTime":"2025-09-30T13:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.488544 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.488718 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.488855 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:29 crc kubenswrapper[4763]: E0930 13:36:29.489013 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:29 crc kubenswrapper[4763]: E0930 13:36:29.489236 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:29 crc kubenswrapper[4763]: E0930 13:36:29.489395 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.504408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.504451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.504463 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.504482 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.504495 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:29Z","lastTransitionTime":"2025-09-30T13:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.607777 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.607868 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.607888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.607915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.607933 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:29Z","lastTransitionTime":"2025-09-30T13:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.711794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.711890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.711911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.711939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.711961 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:29Z","lastTransitionTime":"2025-09-30T13:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.816592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.816704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.816731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.816768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.816798 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:29Z","lastTransitionTime":"2025-09-30T13:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.919587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.919663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.919675 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.919693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:29 crc kubenswrapper[4763]: I0930 13:36:29.919705 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:29Z","lastTransitionTime":"2025-09-30T13:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.024341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.024444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.024473 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.024509 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.024534 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:30Z","lastTransitionTime":"2025-09-30T13:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.128081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.128163 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.128180 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.128686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.128731 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:30Z","lastTransitionTime":"2025-09-30T13:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.232593 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.232696 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.232716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.232748 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.232770 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:30Z","lastTransitionTime":"2025-09-30T13:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.337122 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.337223 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.337249 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.337288 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.337317 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:30Z","lastTransitionTime":"2025-09-30T13:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.441018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.441113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.441140 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.441178 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.441211 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:30Z","lastTransitionTime":"2025-09-30T13:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.489185 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:30 crc kubenswrapper[4763]: E0930 13:36:30.490229 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.544853 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.544913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.544930 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.544954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.544968 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:30Z","lastTransitionTime":"2025-09-30T13:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.649344 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.649430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.649446 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.649481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.649500 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:30Z","lastTransitionTime":"2025-09-30T13:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.752898 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.752980 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.753005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.753037 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.753060 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:30Z","lastTransitionTime":"2025-09-30T13:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.856734 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.856777 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.856814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.856832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.856842 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:30Z","lastTransitionTime":"2025-09-30T13:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.959942 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.960011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.960045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.960076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:30 crc kubenswrapper[4763]: I0930 13:36:30.960088 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:30Z","lastTransitionTime":"2025-09-30T13:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.064726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.064786 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.064805 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.064830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.064849 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:31Z","lastTransitionTime":"2025-09-30T13:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.167946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.167997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.168010 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.168031 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.168045 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:31Z","lastTransitionTime":"2025-09-30T13:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.270618 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.270673 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.270685 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.270701 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.270712 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:31Z","lastTransitionTime":"2025-09-30T13:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.373559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.373624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.373641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.373685 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.373701 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:31Z","lastTransitionTime":"2025-09-30T13:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.476954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.477009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.477025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.477046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.477059 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:31Z","lastTransitionTime":"2025-09-30T13:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.489432 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:31 crc kubenswrapper[4763]: E0930 13:36:31.489692 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.489958 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:31 crc kubenswrapper[4763]: E0930 13:36:31.490037 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.490171 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:31 crc kubenswrapper[4763]: E0930 13:36:31.490236 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.580713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.580755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.580766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.580783 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.580795 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:31Z","lastTransitionTime":"2025-09-30T13:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.687242 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.687491 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.687501 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.687522 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.687541 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:31Z","lastTransitionTime":"2025-09-30T13:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.791832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.791911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.791936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.791970 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.791994 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:31Z","lastTransitionTime":"2025-09-30T13:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.896845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.896934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.896962 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.896995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:31 crc kubenswrapper[4763]: I0930 13:36:31.897020 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:31Z","lastTransitionTime":"2025-09-30T13:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.000729 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.000790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.000805 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.000829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.000845 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:32Z","lastTransitionTime":"2025-09-30T13:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.104718 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.104789 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.104804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.104824 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.104836 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:32Z","lastTransitionTime":"2025-09-30T13:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.208909 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.208976 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.208990 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.209012 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.209025 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:32Z","lastTransitionTime":"2025-09-30T13:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.312668 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.312754 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.312778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.312807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.312827 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:32Z","lastTransitionTime":"2025-09-30T13:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.416871 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.416925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.416936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.416955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.416965 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:32Z","lastTransitionTime":"2025-09-30T13:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.489317 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:32 crc kubenswrapper[4763]: E0930 13:36:32.489531 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.519272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.519334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.519351 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.519377 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.519393 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:32Z","lastTransitionTime":"2025-09-30T13:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.622523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.622574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.622584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.622616 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.622631 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:32Z","lastTransitionTime":"2025-09-30T13:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.725980 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.726040 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.726057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.726080 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.726092 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:32Z","lastTransitionTime":"2025-09-30T13:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.828954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.829026 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.829044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.829072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.829088 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:32Z","lastTransitionTime":"2025-09-30T13:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.931840 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.932563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.932580 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.932652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:32 crc kubenswrapper[4763]: I0930 13:36:32.932668 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:32Z","lastTransitionTime":"2025-09-30T13:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.036188 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.036238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.036254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.036275 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.036289 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:33Z","lastTransitionTime":"2025-09-30T13:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.139587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.139661 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.139673 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.139689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.139700 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:33Z","lastTransitionTime":"2025-09-30T13:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.241894 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.241940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.241950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.241966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.241978 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:33Z","lastTransitionTime":"2025-09-30T13:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.345453 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.345524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.345536 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.345559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.345585 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:33Z","lastTransitionTime":"2025-09-30T13:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.448957 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.449014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.449028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.449047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.449059 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:33Z","lastTransitionTime":"2025-09-30T13:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.488843 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.488907 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.489030 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:33 crc kubenswrapper[4763]: E0930 13:36:33.489023 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:33 crc kubenswrapper[4763]: E0930 13:36:33.489173 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:33 crc kubenswrapper[4763]: E0930 13:36:33.489282 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.551589 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.551648 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.551658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.551675 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.551689 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:33Z","lastTransitionTime":"2025-09-30T13:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.655162 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.655226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.655240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.655263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.655277 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:33Z","lastTransitionTime":"2025-09-30T13:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.759159 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.759245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.759259 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.759283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.759296 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:33Z","lastTransitionTime":"2025-09-30T13:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.861788 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.861861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.861872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.861892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.861906 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:33Z","lastTransitionTime":"2025-09-30T13:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.964624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.964689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.964707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.964735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:33 crc kubenswrapper[4763]: I0930 13:36:33.964750 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:33Z","lastTransitionTime":"2025-09-30T13:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.067566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.067638 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.067650 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.067674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.067693 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.170956 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.171019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.171031 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.171052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.171071 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.274835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.274901 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.274918 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.274942 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.274958 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.378043 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.378108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.378128 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.378155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.378179 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.481648 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.481740 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.481759 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.481781 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.481794 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.488935 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:34 crc kubenswrapper[4763]: E0930 13:36:34.489101 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.585035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.585133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.585147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.585174 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.585192 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.688186 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.688253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.688265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.688286 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.688297 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.695226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.695283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.695295 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.695321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.695337 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: E0930 13:36:34.710002 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.714540 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.714624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.714641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.714660 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.714675 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: E0930 13:36:34.728012 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.732873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.732928 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.732943 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.732966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.732981 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: E0930 13:36:34.746679 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.751748 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.751898 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.752015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.752133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.752240 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: E0930 13:36:34.764828 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.769130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.769290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.769415 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.769532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.769647 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: E0930 13:36:34.782042 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:34 crc kubenswrapper[4763]: E0930 13:36:34.782165 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.791223 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.791259 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.791272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.791291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.791304 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.894473 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.894519 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.894529 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.894546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:34 crc kubenswrapper[4763]: I0930 13:36:34.894560 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:34Z","lastTransitionTime":"2025-09-30T13:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.000289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.000355 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.000372 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.000394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.000408 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:35Z","lastTransitionTime":"2025-09-30T13:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.103116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.103161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.103194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.103214 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.103225 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:35Z","lastTransitionTime":"2025-09-30T13:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.206498 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.206613 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.206628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.206653 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.206667 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:35Z","lastTransitionTime":"2025-09-30T13:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.311024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.311063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.311074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.311089 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.311099 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:35Z","lastTransitionTime":"2025-09-30T13:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.414093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.414137 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.414149 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.414168 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.414180 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:35Z","lastTransitionTime":"2025-09-30T13:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.489189 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:35 crc kubenswrapper[4763]: E0930 13:36:35.489393 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.489213 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:35 crc kubenswrapper[4763]: E0930 13:36:35.489538 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.489213 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:35 crc kubenswrapper[4763]: E0930 13:36:35.492702 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.517581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.517680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.517696 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.517719 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.517734 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:35Z","lastTransitionTime":"2025-09-30T13:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.620766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.620835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.620849 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.620872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.620889 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:35Z","lastTransitionTime":"2025-09-30T13:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.724364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.724423 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.724436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.724458 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.724471 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:35Z","lastTransitionTime":"2025-09-30T13:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.827851 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.827912 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.827926 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.827948 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.827963 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:35Z","lastTransitionTime":"2025-09-30T13:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.931247 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.931441 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.931459 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.931483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:35 crc kubenswrapper[4763]: I0930 13:36:35.931495 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:35Z","lastTransitionTime":"2025-09-30T13:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.034512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.034583 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.034619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.034645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.034662 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:36Z","lastTransitionTime":"2025-09-30T13:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.138229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.138300 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.138314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.138339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.138359 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:36Z","lastTransitionTime":"2025-09-30T13:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.240780 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.240827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.240841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.240859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.240874 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:36Z","lastTransitionTime":"2025-09-30T13:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.343437 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.343498 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.343511 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.343530 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.343540 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:36Z","lastTransitionTime":"2025-09-30T13:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.445984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.446038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.446055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.446072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.446082 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:36Z","lastTransitionTime":"2025-09-30T13:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.489271 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:36 crc kubenswrapper[4763]: E0930 13:36:36.489429 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.548592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.548670 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.548686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.548705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.548720 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:36Z","lastTransitionTime":"2025-09-30T13:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.651504 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.651625 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.651645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.651669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.651683 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:36Z","lastTransitionTime":"2025-09-30T13:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.753586 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.753646 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.753658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.753674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.753686 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:36Z","lastTransitionTime":"2025-09-30T13:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.856128 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.856175 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.856186 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.856199 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.856207 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:36Z","lastTransitionTime":"2025-09-30T13:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.958321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.958371 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.958385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.958401 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:36 crc kubenswrapper[4763]: I0930 13:36:36.958413 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:36Z","lastTransitionTime":"2025-09-30T13:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.060974 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.061022 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.061036 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.061055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.061075 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:37Z","lastTransitionTime":"2025-09-30T13:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.164703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.164768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.164787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.164817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.164842 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:37Z","lastTransitionTime":"2025-09-30T13:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.267709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.267764 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.267774 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.267792 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.267810 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:37Z","lastTransitionTime":"2025-09-30T13:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.371063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.371112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.371124 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.371143 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.371155 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:37Z","lastTransitionTime":"2025-09-30T13:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.473305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.473344 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.473353 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.473372 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.473384 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:37Z","lastTransitionTime":"2025-09-30T13:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.488793 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.488837 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.488804 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:37 crc kubenswrapper[4763]: E0930 13:36:37.489007 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
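[diagnostic aside] Every record above reports the same root cause, carried in the node's Ready condition: there is no CNI configuration file in /etc/kubernetes/cni/net.d/, so the runtime reports NetworkReady=false and every pod that needs a new sandbox fails to sync. Below is a minimal sketch of the check the kubelet is effectively repeating, in Python; the directory path comes from the log message itself, while running this on the node and the .conf/.conflist/.json filter are assumptions, not something the log states.

import os

# Directory named in the kubelet error message. The network-ready probe
# keeps failing while no CNI config file exists here.
CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"

def cni_config_files(conf_dir: str = CNI_CONF_DIR) -> list[str]:
    """Return candidate CNI config files, sorted the way a runtime would scan them."""
    try:
        entries = os.listdir(conf_dir)
    except FileNotFoundError:
        return []  # a missing directory counts the same as an empty one
    return sorted(e for e in entries if e.endswith((".conf", ".conflist", ".json")))

if __name__ == "__main__":
    files = cni_config_files()
    if files:
        print("CNI config present:", ", ".join(files))
    else:
        # Mirrors the log: NetworkReady stays false until the network
        # provider (OVN-Kubernetes here) writes its configuration.
        print(f"no CNI configuration file in {CNI_CONF_DIR} -- network plugin not ready")

Once the network provider writes its config file into that directory, the kubelet's next sync flips NetworkReady to true and the sandbox creations above can proceed.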
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:37 crc kubenswrapper[4763]: E0930 13:36:37.489380 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:37 crc kubenswrapper[4763]: E0930 13:36:37.489504 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.490078 4763 scope.go:117] "RemoveContainer" containerID="5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f" Sep 30 13:36:37 crc kubenswrapper[4763]: E0930 13:36:37.490351 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.576874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.576934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.576948 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.576984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:37 crc kubenswrapper[4763]: I0930 13:36:37.577000 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:37Z","lastTransitionTime":"2025-09-30T13:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.488827 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv"
Sep 30 13:36:38 crc kubenswrapper[4763]: E0930 13:36:38.489231 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa"
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.506680 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.508920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.509006 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.509048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.509070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.509520 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:38Z","lastTransitionTime":"2025-09-30T13:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.520770 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z"
Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.535187 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.547963 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.561464 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-oper
ator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.583673 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.600745 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf0673c2-f0f3-4380-9228-b65c51f9184c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69768d25767b2d069d78da62764bb2be0c6c1c8f9b4378c499a20d0324fdb7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f882363620739f8700024600e56bc55742489a500c06f523fb9028dd2af5941f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c471b7d5edb6a2a0a1b7df018d846b4d54af48c83aa59b0067b9a98be067aa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.614540 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.614632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.614649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.614671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 
13:36:38.614690 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:38Z","lastTransitionTime":"2025-09-30T13:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.615184 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.627400 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.640222 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.651918 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.669500 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.683787 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.699091 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.715359 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.717232 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.717289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.717305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.717328 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.717343 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:38Z","lastTransitionTime":"2025-09-30T13:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.740784 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df892
6d28e290e2f2412a4b2c243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:21Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI0930 13:36:21.355539 6440 factory.go:1336] Added *v1.Node event handler 7\\\\nI0930 13:36:21.355587 6440 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355660 6440 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:36:21.355730 6440 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:36:21.355780 6440 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:21.355806 6440 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:21.355834 6440 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:36:21.355861 6440 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:36:21.355885 6440 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355888 6440 factory.go:656] Stopping watch factory\\\\nI0930 13:36:21.355924 6440 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:36:21.356205 6440 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0930 13:36:21.356361 6440 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 13:36:21.356439 6440 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:21.356498 6440 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:21.356614 6440 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.754210 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.819617 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.819668 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.819683 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.819705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.819718 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:38Z","lastTransitionTime":"2025-09-30T13:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.923064 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.923102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.923116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.923138 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:38 crc kubenswrapper[4763]: I0930 13:36:38.923151 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:38Z","lastTransitionTime":"2025-09-30T13:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.027867 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.027913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.027924 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.027945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.027958 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:39Z","lastTransitionTime":"2025-09-30T13:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.131096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.131156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.131168 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.131188 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.131202 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:39Z","lastTransitionTime":"2025-09-30T13:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.234156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.234214 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.234227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.234245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.234257 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:39Z","lastTransitionTime":"2025-09-30T13:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.337397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.337457 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.337473 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.337493 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.337505 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:39Z","lastTransitionTime":"2025-09-30T13:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.441278 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.441334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.441350 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.441373 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.441388 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:39Z","lastTransitionTime":"2025-09-30T13:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.489354 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:39 crc kubenswrapper[4763]: E0930 13:36:39.489498 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.489704 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:39 crc kubenswrapper[4763]: E0930 13:36:39.489750 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.489852 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:39 crc kubenswrapper[4763]: E0930 13:36:39.489894 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.543941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.544011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.544026 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.544051 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.544070 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:39Z","lastTransitionTime":"2025-09-30T13:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
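
[Annotation] The recurring NetworkReady=false condition comes from the runtime finding no CNI configuration in /etc/kubernetes/cni/net.d/, which in turn blocks sandbox creation for the three pods above. A simplified sketch of that check follows; the extension filter mirrors the usual *.conf/*.conflist/*.json convention and should be read as an approximation, not the runtime's exact code:

// cnicheck.go: report whether any CNI configuration file exists in the
// directory named by the repeating log message.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory from the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI config dir:", err)
		return
	}
	var found []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		fmt.Println("NetworkReady=false: no CNI configuration files yet")
		return
	}
	fmt.Println("CNI configurations present:", found)
}
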
Has your network provider started?"} Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.647335 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.647451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.647475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.647500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.647512 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:39Z","lastTransitionTime":"2025-09-30T13:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.750110 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.750146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.750157 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.750173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.750182 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:39Z","lastTransitionTime":"2025-09-30T13:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.853056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.853116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.853130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.853152 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.853167 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:39Z","lastTransitionTime":"2025-09-30T13:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.956064 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.956123 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.956135 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.956160 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:39 crc kubenswrapper[4763]: I0930 13:36:39.956185 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:39Z","lastTransitionTime":"2025-09-30T13:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.060207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.060276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.060288 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.060312 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.060327 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:40Z","lastTransitionTime":"2025-09-30T13:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.164204 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.164260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.164270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.164295 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.164307 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:40Z","lastTransitionTime":"2025-09-30T13:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.268089 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.268155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.268165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.268183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.268195 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:40Z","lastTransitionTime":"2025-09-30T13:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.371344 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.371464 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.371486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.371521 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.371540 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:40Z","lastTransitionTime":"2025-09-30T13:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.474974 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.475037 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.475054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.475077 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.475090 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:40Z","lastTransitionTime":"2025-09-30T13:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.488729 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:40 crc kubenswrapper[4763]: E0930 13:36:40.488886 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.552143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:40 crc kubenswrapper[4763]: E0930 13:36:40.552377 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:40 crc kubenswrapper[4763]: E0930 13:36:40.552515 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs podName:394a12b5-37c3-4933-af17-71f5c84ec2fa nodeName:}" failed. No retries permitted until 2025-09-30 13:37:12.552482172 +0000 UTC m=+104.691042447 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs") pod "network-metrics-daemon-rggrv" (UID: "394a12b5-37c3-4933-af17-71f5c84ec2fa") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.577708 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.577768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.577782 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.577806 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.577820 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:40Z","lastTransitionTime":"2025-09-30T13:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
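
[Annotation] The MountVolume.SetUp failure above defers the next retry by 32s ("durationBeforeRetry 32s"). The kubelet retries failed volume operations with a doubling backoff; the toy model below assumes an initial 500ms delay, a factor of 2, and a 2-minute cap (illustrative constants, not a verbatim copy of kubelet's exponentialbackoff package), under which the seventh consecutive failure lands on the 32s window seen here:

// backoff.go: a toy model of the doubling retry delay behind
// "durationBeforeRetry 32s"; constants are assumptions for illustration.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond
	const maxDelay = 2 * time.Minute
	for failures := 1; failures <= 9; failures++ {
		fmt.Printf("failure %d -> wait %s before next retry\n", failures, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
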
Has your network provider started?"} Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.681362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.681427 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.681441 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.681465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.681478 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:40Z","lastTransitionTime":"2025-09-30T13:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.784960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.785004 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.785015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.785035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.785046 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:40Z","lastTransitionTime":"2025-09-30T13:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.888553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.888631 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.888652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.888679 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.888700 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:40Z","lastTransitionTime":"2025-09-30T13:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.992191 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.992274 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.992293 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.992323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:40 crc kubenswrapper[4763]: I0930 13:36:40.992341 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:40Z","lastTransitionTime":"2025-09-30T13:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.095181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.095235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.095247 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.095266 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.095279 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:41Z","lastTransitionTime":"2025-09-30T13:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.198240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.198323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.198349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.198390 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.198411 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:41Z","lastTransitionTime":"2025-09-30T13:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
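
[Annotation] Every "Node became not ready" entry carries the same Ready condition payload from setters.go. A self-contained sketch that decodes one such payload with a local stand-in for v1.NodeCondition (message abridged for brevity) shows the fields the kubelet keeps re-asserting:

// condition.go: decode the condition JSON emitted by setters.go above.
package main

import (
	"encoding/json"
	"fmt"
)

// nodeCondition mirrors the fields visible in the log; it is a local
// stand-in for v1.NodeCondition so the sketch needs no k8s.io imports.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:40Z","lastTransitionTime":"2025-09-30T13:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready"}`
	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		fmt.Println("decode failed:", err)
		return
	}
	fmt.Printf("node Ready=%s reason=%s since=%s\n",
		c.Status, c.Reason, c.LastTransitionTime)
}
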
Has your network provider started?"} Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.301403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.301455 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.301466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.301487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.301498 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:41Z","lastTransitionTime":"2025-09-30T13:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.410950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.411360 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.411672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.411771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.411850 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:41Z","lastTransitionTime":"2025-09-30T13:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.488841 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.488973 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.489051 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:41 crc kubenswrapper[4763]: E0930 13:36:41.489055 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:41 crc kubenswrapper[4763]: E0930 13:36:41.489232 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:41 crc kubenswrapper[4763]: E0930 13:36:41.489353 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.515922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.516323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.516425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.516503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.516573 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:41Z","lastTransitionTime":"2025-09-30T13:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.620324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.620364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.620376 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.620395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.620407 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:41Z","lastTransitionTime":"2025-09-30T13:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.723826 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.723888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.723902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.723926 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.723942 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:41Z","lastTransitionTime":"2025-09-30T13:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.826994 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.827063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.827079 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.827103 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.827119 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:41Z","lastTransitionTime":"2025-09-30T13:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.930156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.930813 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.930857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.930886 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:41 crc kubenswrapper[4763]: I0930 13:36:41.930903 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:41Z","lastTransitionTime":"2025-09-30T13:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.034257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.034294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.034306 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.034326 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.034338 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:42Z","lastTransitionTime":"2025-09-30T13:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.137385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.137444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.137457 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.137479 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.137496 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:42Z","lastTransitionTime":"2025-09-30T13:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.240310 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.240384 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.240400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.240426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.240443 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:42Z","lastTransitionTime":"2025-09-30T13:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.344016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.344061 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.344073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.344093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.344107 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:42Z","lastTransitionTime":"2025-09-30T13:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.447430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.447469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.447481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.447499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.447510 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:42Z","lastTransitionTime":"2025-09-30T13:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.489078 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:42 crc kubenswrapper[4763]: E0930 13:36:42.489296 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.551370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.551420 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.551431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.551451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.551461 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:42Z","lastTransitionTime":"2025-09-30T13:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.655005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.655081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.655102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.655133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.655154 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:42Z","lastTransitionTime":"2025-09-30T13:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.758246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.758293 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.758303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.758321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.758331 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:42Z","lastTransitionTime":"2025-09-30T13:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.862123 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.862181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.862201 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.862231 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.862251 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:42Z","lastTransitionTime":"2025-09-30T13:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.966440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.966495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.966506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.966526 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:42 crc kubenswrapper[4763]: I0930 13:36:42.966539 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:42Z","lastTransitionTime":"2025-09-30T13:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.070490 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.070572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.070647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.070687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.070712 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:43Z","lastTransitionTime":"2025-09-30T13:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.175428 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.175517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.175541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.175992 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.176041 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:43Z","lastTransitionTime":"2025-09-30T13:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.280055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.280134 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.280154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.280181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.280201 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:43Z","lastTransitionTime":"2025-09-30T13:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
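
[Annotation] The identical five-entry blocks recur at roughly 100ms intervals, consistent with a fast startup-time status sync that re-checks readiness on a short cadence until the network plugin reports a configuration. The loop below is an illustration of that polling pattern under assumed timings (100ms tick, 5s demo deadline), not kubelet source:

// fastsync.go: poll a readiness predicate every 100ms, as the repeating
// entries above suggest, stopping once it passes or a demo deadline hits.
package main

import (
	"fmt"
	"os"
	"time"
)

// networkReady stands in for the real check: it passes once any file
// appears in the CNI configuration directory named by the log.
func networkReady() bool {
	entries, err := os.ReadDir("/etc/kubernetes/cni/net.d")
	return err == nil && len(entries) > 0
}

func main() {
	ticker := time.NewTicker(100 * time.Millisecond)
	defer ticker.Stop()
	deadline := time.After(5 * time.Second)
	for {
		select {
		case <-deadline:
			fmt.Println("giving up: network still not ready")
			return
		case <-ticker.C:
			if networkReady() {
				fmt.Println("node Ready: network plugin reported configuration")
				return
			}
			fmt.Println("NodeNotReady: container runtime network not ready")
		}
	}
}
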
Has your network provider started?"} Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.384439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.384483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.384494 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.384511 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.384525 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:43Z","lastTransitionTime":"2025-09-30T13:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.486841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.486893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.486907 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.486924 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.486937 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:43Z","lastTransitionTime":"2025-09-30T13:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.488564 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.488578 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:43 crc kubenswrapper[4763]: E0930 13:36:43.488755 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.488783 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:43 crc kubenswrapper[4763]: E0930 13:36:43.488994 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:43 crc kubenswrapper[4763]: E0930 13:36:43.489231 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.589934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.589983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.589996 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.590022 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.590042 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:43Z","lastTransitionTime":"2025-09-30T13:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.694250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.694316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.694335 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.694378 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.694397 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:43Z","lastTransitionTime":"2025-09-30T13:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.799557 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.800022 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.800163 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.800367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.800491 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:43Z","lastTransitionTime":"2025-09-30T13:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.903900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.903966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.903983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.904008 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.904025 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:43Z","lastTransitionTime":"2025-09-30T13:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.962764 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9qpw_766e1024-d943-4721-a366-83bc3635cc79/kube-multus/0.log" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.963251 4763 generic.go:334] "Generic (PLEG): container finished" podID="766e1024-d943-4721-a366-83bc3635cc79" containerID="821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2" exitCode=1 Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.963346 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9qpw" event={"ID":"766e1024-d943-4721-a366-83bc3635cc79","Type":"ContainerDied","Data":"821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2"} Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.964209 4763 scope.go:117] "RemoveContainer" containerID="821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2" Sep 30 13:36:43 crc kubenswrapper[4763]: I0930 13:36:43.990032 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/en
v\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.008464 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.008522 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.008542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.008569 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.008587 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:44Z","lastTransitionTime":"2025-09-30T13:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.009513 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.042011 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:21Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI0930 13:36:21.355539 6440 factory.go:1336] Added *v1.Node event handler 7\\\\nI0930 13:36:21.355587 6440 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355660 6440 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:36:21.355730 6440 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:36:21.355780 6440 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:21.355806 6440 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:21.355834 6440 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:36:21.355861 6440 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:36:21.355885 6440 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355888 6440 factory.go:656] Stopping watch factory\\\\nI0930 13:36:21.355924 6440 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:36:21.356205 6440 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0930 13:36:21.356361 6440 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 13:36:21.356439 6440 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:21.356498 6440 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:21.356614 6440 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.057281 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.070054 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.083648 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.107899 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:43Z\\\",\\\"message\\\":\\\"2025-09-30T13:35:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59\\\\n2025-09-30T13:35:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59 to /host/opt/cni/bin/\\\\n2025-09-30T13:35:58Z [verbose] multus-daemon started\\\\n2025-09-30T13:35:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:36:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.111976 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.112020 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.112030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.112046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.112057 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:44Z","lastTransitionTime":"2025-09-30T13:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.122871 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.137560 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.149705 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.161161 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf0673c2-f0f3-4380-9228-b65c51f9184c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69768d25767b2d069d78da62764bb2be0c6c1c8f9b4378c499a20d0324fdb7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f882363620739f8700024600e56bc55742489a500c06f523fb9028dd2af5941f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c471b7d5edb6a2a0a1b7df018d846b4d54af48c83aa59b0067b9a98be067aa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.172065 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.183423 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.193717 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.206136 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.214549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.214622 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.214635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.214655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.214665 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:44Z","lastTransitionTime":"2025-09-30T13:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.224611 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d
3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.235495 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.317505 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.317687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.317703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.317743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.317756 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:44Z","lastTransitionTime":"2025-09-30T13:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.420479 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.420520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.420531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.420546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.420556 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:44Z","lastTransitionTime":"2025-09-30T13:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.489589 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:44 crc kubenswrapper[4763]: E0930 13:36:44.489820 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.523046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.523115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.523135 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.523165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.523185 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:44Z","lastTransitionTime":"2025-09-30T13:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.626298 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.626346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.626357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.626378 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.626389 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:44Z","lastTransitionTime":"2025-09-30T13:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.729336 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.729407 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.729424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.729450 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.729466 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:44Z","lastTransitionTime":"2025-09-30T13:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.832929 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.832986 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.833008 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.833034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.833050 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:44Z","lastTransitionTime":"2025-09-30T13:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.936153 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.936208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.936219 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.936240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.936251 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:44Z","lastTransitionTime":"2025-09-30T13:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.969376 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9qpw_766e1024-d943-4721-a366-83bc3635cc79/kube-multus/0.log" Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.969446 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9qpw" event={"ID":"766e1024-d943-4721-a366-83bc3635cc79","Type":"ContainerStarted","Data":"2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e"} Sep 30 13:36:44 crc kubenswrapper[4763]: I0930 13:36:44.992716 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.018531 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.035551 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.040620 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.040656 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.040669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.040691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.040706 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.048467 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc 
kubenswrapper[4763]: I0930 13:36:45.065102 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"
cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.080028 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.096316 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.110722 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.133915 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:21Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI0930 13:36:21.355539 6440 factory.go:1336] Added *v1.Node event handler 7\\\\nI0930 13:36:21.355587 6440 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355660 6440 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:36:21.355730 6440 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:36:21.355780 6440 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:21.355806 6440 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:21.355834 6440 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:36:21.355861 6440 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:36:21.355885 6440 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355888 6440 factory.go:656] Stopping watch factory\\\\nI0930 13:36:21.355924 6440 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:36:21.356205 6440 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0930 13:36:21.356361 6440 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 13:36:21.356439 6440 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:21.356498 6440 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:21.356614 6440 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.143656 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.143716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.143728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.143753 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.143768 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.146521 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.161254 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.168326 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.168382 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.168400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.168425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.168446 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.181026 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: E0930 13:36:45.183018 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"a
aaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.188188 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.188226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.188240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.188265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.188280 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.198495 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:43Z\\\",\\\"message\\\":\\\"2025-09-30T13:35:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59\\\\n2025-09-30T13:35:58+00:00 [cnibincopy] Successfully 
moved files in /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59 to /host/opt/cni/bin/\\\\n2025-09-30T13:35:58Z [verbose] multus-daemon started\\\\n2025-09-30T13:35:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:36:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: E0930 13:36:45.206281 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.210374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.210413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.210424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.210442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.210454 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.214715 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: E0930 13:36:45.225383 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.230061 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.230262 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.230301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.230312 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.230336 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.230353 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: E0930 13:36:45.242784 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.245368 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.247473 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.247513 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.247526 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.247550 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.247565 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.259276 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf0673c2-f0f3-4380-9228-b65c51f9184c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69768d25767b2d069d78da62764bb2be0c6c1c8f9b4378c499a20d0324fdb7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f882363620739f8700024600e56bc55742489a500c06f523fb9028dd2af5941f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c471b7d5edb6a2a0a1b7df018d846b4d54af48c83aa59b0067b9a98be067aa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: E0930 13:36:45.259910 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:45 crc kubenswrapper[4763]: E0930 13:36:45.260042 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.262010 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.262042 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.262053 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.262072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.262085 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.364708 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.364772 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.364784 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.364806 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.364828 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.469238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.469326 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.469341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.469367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.469383 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.488700 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.488771 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.488732 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:45 crc kubenswrapper[4763]: E0930 13:36:45.488921 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:45 crc kubenswrapper[4763]: E0930 13:36:45.489067 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:45 crc kubenswrapper[4763]: E0930 13:36:45.489352 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.572759 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.572814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.572826 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.572844 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.572858 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.676192 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.676244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.676257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.676278 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.676291 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.780136 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.780210 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.780238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.780273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.780297 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.892984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.893047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.893066 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.893096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.893118 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.997383 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.997664 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.997692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.997729 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:45 crc kubenswrapper[4763]: I0930 13:36:45.997753 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:45Z","lastTransitionTime":"2025-09-30T13:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.101106 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.101182 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.101207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.101244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.101268 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:46Z","lastTransitionTime":"2025-09-30T13:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.206120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.206209 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.206233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.206262 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.206283 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:46Z","lastTransitionTime":"2025-09-30T13:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.309617 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.309877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.309925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.309954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.309971 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:46Z","lastTransitionTime":"2025-09-30T13:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.413837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.413902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.413915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.413937 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.413951 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:46Z","lastTransitionTime":"2025-09-30T13:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.488709 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:46 crc kubenswrapper[4763]: E0930 13:36:46.488870 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.519235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.519299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.519315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.519334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.519348 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:46Z","lastTransitionTime":"2025-09-30T13:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.622555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.622681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.622699 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.622722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.622735 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:46Z","lastTransitionTime":"2025-09-30T13:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.726592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.726671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.726687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.726708 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.726721 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:46Z","lastTransitionTime":"2025-09-30T13:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.830503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.830668 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.830700 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.830741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.830782 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:46Z","lastTransitionTime":"2025-09-30T13:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.934587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.934670 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.934687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.934708 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:46 crc kubenswrapper[4763]: I0930 13:36:46.934721 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:46Z","lastTransitionTime":"2025-09-30T13:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.037580 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.037688 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.037706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.037729 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.037745 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:47Z","lastTransitionTime":"2025-09-30T13:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.141359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.141439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.141465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.141491 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.141512 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:47Z","lastTransitionTime":"2025-09-30T13:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.244561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.244636 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.244654 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.244673 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.244685 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:47Z","lastTransitionTime":"2025-09-30T13:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.347904 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.347957 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.347970 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.347994 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.348006 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:47Z","lastTransitionTime":"2025-09-30T13:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.450769 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.450828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.450841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.450862 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.450880 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:47Z","lastTransitionTime":"2025-09-30T13:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.488935 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.488986 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.488935 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:47 crc kubenswrapper[4763]: E0930 13:36:47.489105 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:47 crc kubenswrapper[4763]: E0930 13:36:47.489302 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:47 crc kubenswrapper[4763]: E0930 13:36:47.489542 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.553375 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.553435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.553449 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.553479 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.553492 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:47Z","lastTransitionTime":"2025-09-30T13:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.656674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.656722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.656736 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.656759 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.656777 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:47Z","lastTransitionTime":"2025-09-30T13:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.759819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.759883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.759900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.759923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.759941 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:47Z","lastTransitionTime":"2025-09-30T13:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.862642 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.862689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.862705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.862724 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.862734 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:47Z","lastTransitionTime":"2025-09-30T13:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.966233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.966307 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.966331 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.966362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:47 crc kubenswrapper[4763]: I0930 13:36:47.966386 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:47Z","lastTransitionTime":"2025-09-30T13:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.069101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.069146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.069158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.069204 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.069216 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:48Z","lastTransitionTime":"2025-09-30T13:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.172886 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.172939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.172949 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.172969 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.172981 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:48Z","lastTransitionTime":"2025-09-30T13:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.277337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.277402 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.277417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.277442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.277456 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:48Z","lastTransitionTime":"2025-09-30T13:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.381296 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.381344 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.381354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.381373 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.381385 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:48Z","lastTransitionTime":"2025-09-30T13:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.485402 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.485473 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.485494 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.485524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.485545 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:48Z","lastTransitionTime":"2025-09-30T13:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.490273 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:48 crc kubenswrapper[4763]: E0930 13:36:48.490487 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.513569 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.531962 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:43Z\\\",\\\"message\\\":\\\"2025-09-30T13:35:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59\\\\n2025-09-30T13:35:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59 to /host/opt/cni/bin/\\\\n2025-09-30T13:35:58Z [verbose] multus-daemon started\\\\n2025-09-30T13:35:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:36:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.549375 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 
13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.567311 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.584577 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z"
Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.588010 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.588050 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.588070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.588094 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.588110 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:48Z","lastTransitionTime":"2025-09-30T13:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.600565 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf0673c2-f0f3-4380-9228-b65c51f9184c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69768d25767b2d069d78da62764bb2be0c6c1c8f9b4378c499a20d0324fdb7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f882363620739f8700024600e56bc55742489a500c06f523fb9028dd2af5941f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c471b7d5edb6a2a0a1b7df018d846b4d54af48c83aa59b0067b9a98be067aa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.615711 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8
646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.632803 4763 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.647250 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.657998 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.676236 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.687642 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.692112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.692142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.692153 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.692172 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.692184 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:48Z","lastTransitionTime":"2025-09-30T13:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.701292 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.715553 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.730572 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.753476 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:21Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI0930 13:36:21.355539 6440 factory.go:1336] Added *v1.Node event handler 7\\\\nI0930 13:36:21.355587 6440 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355660 6440 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:36:21.355730 6440 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:36:21.355780 6440 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:21.355806 6440 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:21.355834 6440 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:36:21.355861 6440 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:36:21.355885 6440 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355888 6440 factory.go:656] Stopping watch factory\\\\nI0930 13:36:21.355924 6440 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:36:21.356205 6440 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0930 13:36:21.356361 6440 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 13:36:21.356439 6440 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:21.356498 6440 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:21.356614 6440 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.767741 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:48Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.794546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.794616 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.794631 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.794649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.794663 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:48Z","lastTransitionTime":"2025-09-30T13:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.897218 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.897271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.897289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.897314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:48 crc kubenswrapper[4763]: I0930 13:36:48.897334 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:48Z","lastTransitionTime":"2025-09-30T13:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.005779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.006129 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.006209 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.006283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.006359 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:49Z","lastTransitionTime":"2025-09-30T13:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.109361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.109396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.109407 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.109424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.109434 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:49Z","lastTransitionTime":"2025-09-30T13:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.212502 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.212944 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.213095 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.213183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.213262 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:49Z","lastTransitionTime":"2025-09-30T13:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.320083 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.320140 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.320154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.320202 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.320217 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:49Z","lastTransitionTime":"2025-09-30T13:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.424115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.424181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.424203 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.424232 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.424252 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:49Z","lastTransitionTime":"2025-09-30T13:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.489714 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.489769 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.489816 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:49 crc kubenswrapper[4763]: E0930 13:36:49.490558 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:49 crc kubenswrapper[4763]: E0930 13:36:49.490717 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:49 crc kubenswrapper[4763]: E0930 13:36:49.490846 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.528182 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.528245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.528258 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.528281 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.528302 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:49Z","lastTransitionTime":"2025-09-30T13:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.632058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.632134 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.632157 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.632187 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.632207 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:49Z","lastTransitionTime":"2025-09-30T13:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.735973 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.736064 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.736093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.736133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.736158 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:49Z","lastTransitionTime":"2025-09-30T13:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.840714 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.840778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.840790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.840810 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.840821 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:49Z","lastTransitionTime":"2025-09-30T13:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.944264 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.944326 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.944340 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.944367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:49 crc kubenswrapper[4763]: I0930 13:36:49.944383 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:49Z","lastTransitionTime":"2025-09-30T13:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.047270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.047320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.047339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.047363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.047375 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:50Z","lastTransitionTime":"2025-09-30T13:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.150704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.150764 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.150773 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.150794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.150806 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:50Z","lastTransitionTime":"2025-09-30T13:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.253894 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.254165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.254181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.254211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.254227 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:50Z","lastTransitionTime":"2025-09-30T13:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.357516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.357592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.357635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.357658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.357671 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:50Z","lastTransitionTime":"2025-09-30T13:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.460931 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.461009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.461024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.461052 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.461067 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:50Z","lastTransitionTime":"2025-09-30T13:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.489283 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:50 crc kubenswrapper[4763]: E0930 13:36:50.489521 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.564587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.564812 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.564833 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.564853 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.564869 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:50Z","lastTransitionTime":"2025-09-30T13:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.668053 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.668135 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.668159 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.668188 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.668209 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:50Z","lastTransitionTime":"2025-09-30T13:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.772393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.772439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.772450 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.772469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.772484 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:50Z","lastTransitionTime":"2025-09-30T13:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.875537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.875590 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.875626 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.875650 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.875665 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:50Z","lastTransitionTime":"2025-09-30T13:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.979342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.979414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.979426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.979449 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:50 crc kubenswrapper[4763]: I0930 13:36:50.979462 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:50Z","lastTransitionTime":"2025-09-30T13:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.082300 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.082353 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.082365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.082382 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.082394 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:51Z","lastTransitionTime":"2025-09-30T13:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.185362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.185403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.185413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.185427 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.185436 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:51Z","lastTransitionTime":"2025-09-30T13:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.282397 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.282806 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:55.28274012 +0000 UTC m=+147.421300455 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.289587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.289650 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.289660 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.289680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.289692 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:51Z","lastTransitionTime":"2025-09-30T13:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.383764 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.383837 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.383867 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.383933 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384060 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384137 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384092 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384169 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384193 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384251 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:37:55.384223839 +0000 UTC m=+147.522784134 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384249 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384288 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:37:55.38426165 +0000 UTC m=+147.522821975 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384305 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384334 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384400 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:37:55.384343142 +0000 UTC m=+147.522903697 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.384446 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:37:55.384425994 +0000 UTC m=+147.522986319 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.392879 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.392936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.392961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.392995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.393014 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:51Z","lastTransitionTime":"2025-09-30T13:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.490178 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.490341 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.490181 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.490642 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.491384 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:51 crc kubenswrapper[4763]: E0930 13:36:51.491948 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
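
The nestedpendingoperations.go:348 records above are the kubelet's per-volume retry backoff: after a failed mount or unmount it stamps the next permitted attempt ("No retries permitted until ...") along with the current backoff interval ("durationBeforeRetry 1m4s"). A minimal sketch, assuming only the line format visible above, for listing which volume is in backoff and until when:

import re

record = ('E0930 13:36:51.384400 4763 nestedpendingoperations.go:348] Operation for '
          '"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl '
          'podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. '
          'No retries permitted until 2025-09-30 13:37:55.384343142 +0000 UTC '
          'm=+147.522903697 (durationBeforeRetry 1m4s).')

m = re.search(r'volumeName:(\S+).*?No retries permitted until ([\d-]+ [\d:.]+).*?'
              r'\(durationBeforeRetry (\w+)\)', record)
if m:
    volume, deadline, backoff = m.groups()
    # -> kubernetes.io/projected/...-kube-api-access-s2dwl retries at
    #    2025-09-30 13:37:55.384343142 (backoff 1m4s)
    print(f"{volume} retries at {deadline} (backoff {backoff})")
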
Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.499093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.499157 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.499181 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.499439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.499471 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:51Z","lastTransitionTime":"2025-09-30T13:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.602323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.602369 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.602378 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.602395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.602407 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:51Z","lastTransitionTime":"2025-09-30T13:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.705255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.705315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.705329 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.705343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.705354 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:51Z","lastTransitionTime":"2025-09-30T13:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.809258 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.809349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.809371 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.809402 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.809424 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:51Z","lastTransitionTime":"2025-09-30T13:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.912680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.912736 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.912747 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.912768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:51 crc kubenswrapper[4763]: I0930 13:36:51.912778 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:51Z","lastTransitionTime":"2025-09-30T13:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.015200 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.015249 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.015262 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.015285 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.015306 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:52Z","lastTransitionTime":"2025-09-30T13:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.119112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.119198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.119225 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.119261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.119290 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:52Z","lastTransitionTime":"2025-09-30T13:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.222412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.222470 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.222480 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.222499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.222514 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:52Z","lastTransitionTime":"2025-09-30T13:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.325429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.325471 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.325480 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.325496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.325506 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:52Z","lastTransitionTime":"2025-09-30T13:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.428928 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.428987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.429002 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.429024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.429037 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:52Z","lastTransitionTime":"2025-09-30T13:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.489947 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:52 crc kubenswrapper[4763]: E0930 13:36:52.490746 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa"
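
Every NotReady heartbeat and every "Error syncing pod" above points at the same root condition: nothing has yet written a CNI config into /etc/kubernetes/cni/net.d/, which on this node is expected to come from the OVN-Kubernetes rollout seen elsewhere in the log. A minimal check sketch (the directory path is quoted verbatim from the kubelet message; the suffix list is an assumption about conventional CNI config names):

import os

# Path comes straight from the kubelet message above.
CNI_DIR = "/etc/kubernetes/cni/net.d"

try:
    # CNI config files conventionally end in .conf, .conflist or .json.
    confs = [f for f in os.listdir(CNI_DIR)
             if f.endswith((".conf", ".conflist", ".json"))]
except FileNotFoundError:
    confs = []

if confs:
    print("CNI configs present:", confs)
else:
    # Matches the condition kubelet keeps reporting above.
    print("no CNI configuration file in", CNI_DIR)
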
Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.490789 4763 scope.go:117] "RemoveContainer" containerID="5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.532431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.532490 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.532505 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.532531 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.532546 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:52Z","lastTransitionTime":"2025-09-30T13:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.636264 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.636322 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.636339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.636361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.636376 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:52Z","lastTransitionTime":"2025-09-30T13:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.738844 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.738909 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.738922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.738943 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.738955 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:52Z","lastTransitionTime":"2025-09-30T13:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.843404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.843446 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.843456 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.843476 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.843489 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:52Z","lastTransitionTime":"2025-09-30T13:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.946569 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.946636 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.946648 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.946666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.946680 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:52Z","lastTransitionTime":"2025-09-30T13:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:52 crc kubenswrapper[4763]: I0930 13:36:52.999428 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/2.log" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.002014 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158"} Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.003195 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.015887 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.026774 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.043413 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.049517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.049617 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:53 crc 
kubenswrapper[4763]: I0930 13:36:53.049642 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.049686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.049706 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:53Z","lastTransitionTime":"2025-09-30T13:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.056782 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z"
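
From here on, every status patch the kubelet sends is rejected the same way: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate that expired 2025-08-24T17:21:41Z while the node clock reads 2025-09-30T13:36:53Z, so no pod status can be persisted until that certificate is rotated. A minimal sketch for confirming such an expiry window from a PEM file with the third-party cryptography package (the file path is hypothetical; not_valid_after_utc requires cryptography 42+):

from datetime import datetime, timezone

from cryptography import x509  # pip install cryptography

# Hypothetical location -- point this at the cert the webhook actually serves.
PEM_PATH = "/path/to/webhook-serving-cert.pem"

with open(PEM_PATH, "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

now = datetime.now(timezone.utc)
# For the failures above this would report an expiry of 2025-08-24T17:21:41Z,
# over a month before the node's current time.
print("notAfter:", cert.not_valid_after_utc)
print("expired:", cert.not_valid_after_utc < now)
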
30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.074230 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.096805 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.117875 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.131430 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea17
7225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.152130 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ad8abb3a18ac67d26ed1668c12b46523970fcbb
741bf3c9c5d599bf67891158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:21Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI0930 13:36:21.355539 6440 factory.go:1336] Added *v1.Node event handler 7\\\\nI0930 13:36:21.355587 6440 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355660 6440 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:36:21.355730 6440 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:36:21.355780 6440 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:21.355806 6440 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:21.355834 6440 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:36:21.355861 6440 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:36:21.355885 6440 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355888 6440 factory.go:656] Stopping watch factory\\\\nI0930 13:36:21.355924 6440 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:36:21.356205 6440 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0930 13:36:21.356361 6440 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 13:36:21.356439 6440 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:21.356498 6440 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:21.356614 6440 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.152758 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.152811 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.152825 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.152845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.152858 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:53Z","lastTransitionTime":"2025-09-30T13:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.171470 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.222777 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:43Z\\\",\\\"message\\\":\\\"2025-09-30T13:35:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59\\\\n2025-09-30T13:35:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59 to /host/opt/cni/bin/\\\\n2025-09-30T13:35:58Z [verbose] multus-daemon started\\\\n2025-09-30T13:35:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:36:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.246308 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 
13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.256116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.256167 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.256183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.256204 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.256219 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:53Z","lastTransitionTime":"2025-09-30T13:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.263482 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.278848 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.292060 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf0673c2-f0f3-4380-9228-b65c51f9184c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69768d25767b2d069d78da62764bb2be0c6c1c8f9b4378c499a20d0324fdb7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f882363620739f8700024600e56bc55742489a500c06f523fb9028dd2af5941f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c471b7d5edb6a2a0a1b7df018d846b4d54af48c83aa59b0067b9a98be067aa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.309349 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.323690 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.359286 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.359338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.359354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.359373 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.359682 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:53Z","lastTransitionTime":"2025-09-30T13:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.462827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.462859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.462870 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.462882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.462891 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:53Z","lastTransitionTime":"2025-09-30T13:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.488309 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.488335 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.488422 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:53 crc kubenswrapper[4763]: E0930 13:36:53.488437 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:53 crc kubenswrapper[4763]: E0930 13:36:53.488583 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:53 crc kubenswrapper[4763]: E0930 13:36:53.489162 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.504056 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.565470 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.565524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.565537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.565553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.565565 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:53Z","lastTransitionTime":"2025-09-30T13:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.668962 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.669053 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.669073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.669101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.669121 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:53Z","lastTransitionTime":"2025-09-30T13:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.772877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.772965 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.772988 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.773020 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.773043 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:53Z","lastTransitionTime":"2025-09-30T13:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.876447 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.876514 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.876533 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.876559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.876579 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:53Z","lastTransitionTime":"2025-09-30T13:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.980234 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.980298 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.980313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.980339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:53 crc kubenswrapper[4763]: I0930 13:36:53.980352 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:53Z","lastTransitionTime":"2025-09-30T13:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.009503 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/3.log" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.010707 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/2.log" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.014721 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" exitCode=1 Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.014797 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158"} Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.014862 4763 scope.go:117] "RemoveContainer" containerID="5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.015708 4763 scope.go:117] "RemoveContainer" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" Sep 30 13:36:54 crc kubenswrapper[4763]: E0930 13:36:54.015878 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.041723 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.062675 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.081943 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf0673c2-f0f3-4380-9228-b65c51f9184c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69768d25767b2d069d78da62764bb2be0c6c1c8f9b4378c499a20d0324fdb7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f882363620739f8700024600e56bc55742489a500c06f523fb9028dd2af5941f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c471b7d5edb6a2a0a1b7df018d846b4d54af48c83aa59b0067b9a98be067aa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.083474 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.083503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.083517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.083535 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 
13:36:54.083549 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:54Z","lastTransitionTime":"2025-09-30T13:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.098366 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.114215 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.131207 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.147619 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.161234 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.182233 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.186434 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.186462 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:54 crc 
kubenswrapper[4763]: I0930 13:36:54.186472 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.186492 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.186504 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:54Z","lastTransitionTime":"2025-09-30T13:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.201893 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.216837 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f663835-af59-4255-b044-1c7219437176\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab73a8bdd6e33eb58327d87ab56400b259379d650b5e5f3b3c51e64d22beb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed9cc2883dbd039b8ec767fb752d1fa8c5533f80bf47d819598a6ac173959563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed9cc2883dbd039b8ec767fb752d1fa8c5533f80bf47d819598a6ac173959563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.235859 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.257710 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.285376 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a4992336500d0cbd03c544b497e0548152df8926d28e290e2f2412a4b2c243f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:21Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI0930 13:36:21.355539 6440 factory.go:1336] Added *v1.Node event handler 7\\\\nI0930 13:36:21.355587 6440 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355660 6440 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:36:21.355730 6440 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:36:21.355780 6440 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:36:21.355806 6440 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:36:21.355834 6440 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:36:21.355861 6440 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:36:21.355885 6440 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:36:21.355888 6440 factory.go:656] Stopping watch factory\\\\nI0930 13:36:21.355924 6440 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:36:21.356205 6440 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0930 13:36:21.356361 6440 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 13:36:21.356439 6440 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:36:21.356498 6440 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:36:21.356614 6440 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:53Z\\\",\\\"message\\\":\\\"t:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 13:36:53.346138 6823 services_controller.go:444] Built service openshift-marketplace/community-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0930 13:36:53.346146 6823 services_controller.go:445] Built service openshift-marketplace/community-operators LB template configs for network=default: []services.lbConfig(nil)\\\\nF0930 13:36:53.345935 6823 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.293783 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.294185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.294328 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.294429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.294506 4763 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:54Z","lastTransitionTime":"2025-09-30T13:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.307327 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.326682 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.345008 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:43Z\\\",\\\"message\\\":\\\"2025-09-30T13:35:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59\\\\n2025-09-30T13:35:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59 to /host/opt/cni/bin/\\\\n2025-09-30T13:35:58Z [verbose] multus-daemon started\\\\n2025-09-30T13:35:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:36:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.361940 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:54Z is after 2025-08-24T17:21:41Z" Sep 30 
13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.397490 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.397530 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.397542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.397558 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.397569 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:54Z","lastTransitionTime":"2025-09-30T13:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.489469 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:54 crc kubenswrapper[4763]: E0930 13:36:54.490351 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.501189 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.501233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.501250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.501271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.501290 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:54Z","lastTransitionTime":"2025-09-30T13:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.605029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.605101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.605130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.605163 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.605189 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:54Z","lastTransitionTime":"2025-09-30T13:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.709156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.709240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.709262 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.709292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.709315 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:54Z","lastTransitionTime":"2025-09-30T13:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.812891 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.812935 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.812944 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.812960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.812973 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:54Z","lastTransitionTime":"2025-09-30T13:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.916232 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.916338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.916360 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.916387 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:54 crc kubenswrapper[4763]: I0930 13:36:54.916409 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:54Z","lastTransitionTime":"2025-09-30T13:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.018785 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.018841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.018856 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.018877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.018891 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.020841 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/3.log" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.025746 4763 scope.go:117] "RemoveContainer" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" Sep 30 13:36:55 crc kubenswrapper[4763]: E0930 13:36:55.025995 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.048541 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.064379 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.091323 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf0673c2-f0f3-4380-9228-b65c51f9184c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69768d25767b2d069d78da62764bb2be0c6c1c8f9b4378c499a20d0324fdb7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f882363620739f8700024600e56bc55742489a500c06f523fb9028dd2af5941f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c471b7d5edb6a2a0a1b7df018d846b4d54af48c83aa59b0067b9a98be067aa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.106421 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.121889 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.123213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.123231 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.123242 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.123255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.123264 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.137353 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.155773 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.167036 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.190115 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.203004 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.214110 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f663835-af59-4255-b044-1c7219437176\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab73a8bdd6e33eb58327d87ab56400b259379d650b5e5f3b3c51e64d22beb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed9cc2883dbd039b8ec767fb752d1fa8c5533f80bf47d819598a6ac173959563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed9cc2883dbd039b8ec767fb752d1fa8c5533f80bf47d819598a6ac173959563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.227685 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.227742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.227760 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.228177 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.228224 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.229685 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.245592 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.264302 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ad8abb3a18ac67d26ed1668c12b46523970fcbb
741bf3c9c5d599bf67891158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:53Z\\\",\\\"message\\\":\\\"t:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 13:36:53.346138 6823 services_controller.go:444] Built service openshift-marketplace/community-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0930 13:36:53.346146 6823 services_controller.go:445] Built service openshift-marketplace/community-operators LB template configs for network=default: []services.lbConfig(nil)\\\\nF0930 13:36:53.345935 6823 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.276819 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.291478 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.307244 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:43Z\\\",\\\"message\\\":\\\"2025-09-30T13:35:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59\\\\n2025-09-30T13:35:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59 to /host/opt/cni/bin/\\\\n2025-09-30T13:35:58Z [verbose] multus-daemon started\\\\n2025-09-30T13:35:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:36:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.322252 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 
13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.330683 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.330770 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.330806 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.330828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.330841 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.433384 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.433442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.433457 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.433506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.433522 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.488403 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.488473 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.488406 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:36:55 crc kubenswrapper[4763]: E0930 13:36:55.488564 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 13:36:55 crc kubenswrapper[4763]: E0930 13:36:55.488690 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:36:55 crc kubenswrapper[4763]: E0930 13:36:55.488819 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.535866 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.535913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.535924 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.535944 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.535955 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.619538 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.619633 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.619649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.619672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.619689 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:55 crc kubenswrapper[4763]: E0930 13:36:55.633876 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.639627 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.639698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.639717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.639745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.639764 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:55 crc kubenswrapper[4763]: E0930 13:36:55.662053 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.667640 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.667695 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.667705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.667726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.667744 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:55 crc kubenswrapper[4763]: E0930 13:36:55.686484 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.691532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.691650 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
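
Every failed PATCH in this stretch has the same root cause: the "node.network-node-identity.openshift.io" webhook at https://127.0.0.1:9743 serves a TLS certificate that expired on 2025-08-24T17:21:41Z, over a month before the clock time the kubelet reports. A minimal Go sketch of how the reported expiry could be confirmed from the node (a hypothetical diagnostic, not part of the log; it assumes the webhook is still listening on that port):

// checkcert.go - print the NotAfter date of the certificate served on
// 127.0.0.1:9743, the webhook endpoint named in the log entries above.
// InsecureSkipVerify is deliberate: verification is the very step that
// fails, and we only want to inspect the presented certificate.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("NotAfter:", cert.NotAfter) // the log reports 2025-08-24T17:21:41Z
	fmt.Println("expired now:", time.Now().After(cert.NotAfter))
}
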
event="NodeHasNoDiskPressure" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.691674 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.691704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.691725 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:55 crc kubenswrapper[4763]: E0930 13:36:55.708373 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.713618 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.713803 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
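
Independently of the webhook failure, the node stays NotReady because the container runtime's network readiness check finds no CNI configuration file in /etc/kubernetes/cni/net.d/ (on this cluster the network plugin would normally write one once it starts). A rough Go sketch of that kind of directory probe (illustrative only; the actual check lives in the runtime's CNI layer and also parses the files it finds):

// cnicheck.go - report whether any CNI configuration file is present in
// the directory named by the kubelet message. The extension list is an
// assumption modeled on common CNI config loaders (.conf, .conflist, .json).
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path taken from the log
	var found []string
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(dir, pat))
		if err != nil {
			fmt.Fprintln(os.Stderr, err)
			os.Exit(1)
		}
		found = append(found, matches...)
	}
	if len(found) == 0 {
		fmt.Println("NetworkReady=false: no CNI configuration file in", dir)
		return
	}
	fmt.Println("CNI configs:", found)
}
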
event="NodeHasNoDiskPressure" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.713959 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.714069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.714161 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:55 crc kubenswrapper[4763]: E0930 13:36:55.734787 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"87cb1e2c-9b8e-4ead-9950-c0bd55b572ab\\\",\\\"systemUUID\\\":\\\"aaaf82b4-c2c0-416a-9ead-be6eb519b6b5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:55 crc kubenswrapper[4763]: E0930 13:36:55.735141 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.737810 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
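
The 13:36:55.735141 entry closes one status-sync round: the kubelet attempts the status patch a fixed number of times (nodeStatusUpdateRetry, which is 5 in the upstream kubelet source) before giving up until the next sync interval, which is why runs of "will retry" errors end in "update node status exceeds retry count". Schematically (a simplified sketch of the pattern, not the kubelet code itself):

// retry.go - schematic of the bounded node-status retry that produces the
// "will retry" / "exceeds retry count" pair seen in the log. The constant
// mirrors the upstream kubelet default; the stubbed attempt stands in for
// the PATCH that the expired-certificate webhook rejects.
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5

func tryUpdateNodeStatus() error {
	return errors.New("failed calling webhook: certificate has expired")
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdateNodeStatus(); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return fmt.Errorf("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}
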
event="NodeHasSufficientMemory" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.737881 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.737901 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.737950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.737970 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.842659 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.842724 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.842740 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.842768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.842785 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.946475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.946541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.946556 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.946576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:55 crc kubenswrapper[4763]: I0930 13:36:55.946588 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:55Z","lastTransitionTime":"2025-09-30T13:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.049570 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.050091 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.050246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.050402 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.050541 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:56Z","lastTransitionTime":"2025-09-30T13:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.153359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.153396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.153405 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.153419 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.153429 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:56Z","lastTransitionTime":"2025-09-30T13:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.256372 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.256427 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.256442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.256466 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.256481 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:56Z","lastTransitionTime":"2025-09-30T13:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.362110 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.362385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.362399 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.362425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.362440 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:56Z","lastTransitionTime":"2025-09-30T13:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.465777 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.465816 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.465826 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.465843 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.465854 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:56Z","lastTransitionTime":"2025-09-30T13:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.489412 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:56 crc kubenswrapper[4763]: E0930 13:36:56.489737 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.570090 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.570177 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.570200 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.570227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.570249 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:56Z","lastTransitionTime":"2025-09-30T13:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.674246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.674332 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.674358 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.674392 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.674416 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:56Z","lastTransitionTime":"2025-09-30T13:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.778361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.778437 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.778458 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.778489 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.778513 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:56Z","lastTransitionTime":"2025-09-30T13:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.882238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.882820 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.882979 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.883149 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.883356 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:56Z","lastTransitionTime":"2025-09-30T13:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.987696 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.987766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.987782 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.987802 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:56 crc kubenswrapper[4763]: I0930 13:36:56.987815 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:56Z","lastTransitionTime":"2025-09-30T13:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.091541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.091684 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.091706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.091738 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.091760 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:57Z","lastTransitionTime":"2025-09-30T13:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.195124 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.195299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.195330 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.195365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.195392 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:57Z","lastTransitionTime":"2025-09-30T13:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.298776 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.298858 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.298871 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.298896 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.298910 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:57Z","lastTransitionTime":"2025-09-30T13:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.402729 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.402792 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.402807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.402826 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.402838 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:57Z","lastTransitionTime":"2025-09-30T13:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.488738 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.488891 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.488739 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:36:57 crc kubenswrapper[4763]: E0930 13:36:57.489001 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:36:57 crc kubenswrapper[4763]: E0930 13:36:57.489192 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:36:57 crc kubenswrapper[4763]: E0930 13:36:57.489332 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.507106 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.507144 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.507155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.507172 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.507186 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:57Z","lastTransitionTime":"2025-09-30T13:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.610976 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.611025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.611041 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.611063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.611080 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:57Z","lastTransitionTime":"2025-09-30T13:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.713793 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.713864 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.713880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.713907 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.713925 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:57Z","lastTransitionTime":"2025-09-30T13:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.817331 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.817397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.817409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.817429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.817445 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:57Z","lastTransitionTime":"2025-09-30T13:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.920145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.920180 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.920193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.920211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:57 crc kubenswrapper[4763]: I0930 13:36:57.920221 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:57Z","lastTransitionTime":"2025-09-30T13:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.022912 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.022984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.022995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.023014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.023024 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:58Z","lastTransitionTime":"2025-09-30T13:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.126270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.126338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.126351 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.126422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.126434 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:58Z","lastTransitionTime":"2025-09-30T13:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.229583 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.229658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.229673 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.229698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.229713 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:58Z","lastTransitionTime":"2025-09-30T13:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.332832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.332890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.332904 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.332926 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.332941 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:58Z","lastTransitionTime":"2025-09-30T13:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.435658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.435720 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.435734 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.435756 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.435773 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:58Z","lastTransitionTime":"2025-09-30T13:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.488992 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:36:58 crc kubenswrapper[4763]: E0930 13:36:58.489207 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.513093 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.532685 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a30894a03186d6cd00bd81b883866939aa13d302735900c5d99988d561bc8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.538771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.538825 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.538845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.538873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.538893 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:58Z","lastTransitionTime":"2025-09-30T13:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.553383 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c9qpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766e1024-d943-4721-a366-83bc3635cc79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:43Z\\\",\\\"message\\\":\\\"2025-09-30T13:35:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59\\\\n2025-09-30T13:35:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_806aaed9-41b7-4a34-948d-969e3bd75d59 to /host/opt/cni/bin/\\\\n2025-09-30T13:35:58Z [verbose] multus-daemon started\\\\n2025-09-30T13:35:58Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:36:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4zb6p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c9qpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.567172 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc0ba969-357e-406f-bf02-4e01f260d447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9de0547f7630369b591c36a5cc13a268ee8bd9e1938ad5677d11b8dadc079a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://470403cc73509089548ea03687ca7c2dad7a10097b4c3864976f6d8e3f237e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4dsgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 
13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.583753 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b59ad3e-5591-44dd-b444-4209fb40510a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a52725b7989dab420f5be6c77cb174d5db1a74a712a33da3a87df1aa4bafcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e5c6f520be26e3734c2b54de3aaa88cfa24df706ca7ba551bc8646164ab98a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://776156055a5bafee1595b80ee91370e83ef0dacb06d60cb5b9fc7ecd0169a745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5a3f129e2d5b357965f5030169d4d163f67f024dee5991d8e122af119535b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a200b65b8ad2567356c408803139d7110bfacf302f268f1f48de96f98843e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:35:47.235158 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:35:47.235297 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:35:47.237221 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3982949090/tls.crt::/tmp/serving-cert-3982949090/tls.key\\\\\\\"\\\\nI0930 13:35:47.424049 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:35:47.432044 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:35:47.432073 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:35:47.432100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:35:47.432106 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:35:47.445518 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:35:47.445645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445671 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:35:47.445692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:35:47.445716 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:35:47.445736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:35:47.445757 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:35:47.446043 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:35:47.452830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d0e931d5f74c4d0f2f660cceb8ef7410f44d9fa1ad317abbe5f6c75e8947f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a1834e9bb59f3f0f3bea209dbd970d47b9bfec5b4d0d53323faa75ea65f6298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.600394 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36b70051-c37a-4582-af6a-ee820ad8de92\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a04ecb731bd053ac5ff3120987dbfbb37956a5f7bbfca51bfb5c735532aa4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ecee7bc35763ac367b3315b02d09e4c68b8673aaa48efbf8fd7f916fc40d2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee44eb6136fc13d578209e9b963341515e170248db8f84e765f213b511d6e898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af51b8772db5fc58e881aeda3c49107c5356c6ad71504253424f8f0047e0ecef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.617964 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf0673c2-f0f3-4380-9228-b65c51f9184c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69768d25767b2d069d78da62764bb2be0c6c1c8f9b4378c499a20d0324fdb7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f882363620739f8700024600e56bc55742489a500c06f523fb9028dd2af5941f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c471b7d5edb6a2a0a1b7df018d846b4d54af48c83aa59b0067b9a98be067aa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3741b2f33a2ceeffeb55e0757aa9b0c67ef33394319b418df95812207d9a00bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.633462 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.641790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.641836 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.641849 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.641873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.641888 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:58Z","lastTransitionTime":"2025-09-30T13:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.647930 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.659016 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354dd4ee03e33ad153e4ab5246985c6b90459076d12927e3ef250d08b1d9a30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.670233 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l26sn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894b8880-d853-4f58-8be7-d5db22b85f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01514c48c11cac4715523f1493f659e39bedc3f8ba8d30347d664769442a50ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlhg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l26sn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.690270 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10c96750-42ea-4ae1-b6ae-abd96e614336\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1353cf0edafd8e8a108f6e5309bcfce7752135c4ffc5f18a10408148380e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7d60bdf9a79334035fbb82b1ae6d94b4ae8afd18dee9be27a78a98aacf37f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18f0c71d3f1c8de54f2277a437fa82b1fc66e938fbef474e24ae1ac4fc2d783b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f7cb722238a9bce81d33b95e57d38730e6566abefba77967b4453defa3ff438\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81adf252b1a3cf1bdbdae727a88b3b4526b0423dd093382bfdd2034fef0e341d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7692b14ab64f457f0c7f3b44442a0428fa68d71d7543f419859c26a6d76be81f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99da21044570f7cc00f3a7d1456e0f5033b039759ad49fe82685f454076d55c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:36:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7blnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5fjhf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.703126 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-prttr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3af8022-cedc-4a5e-90e7-7110e1716c14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f658ecc707d3e6799b7b5ad0491c76ba8d534989ffa3e62af9d1d4fae5e3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftjrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:57Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-prttr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.714486 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f663835-af59-4255-b044-1c7219437176\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab73a8bdd6e33eb58327d87ab56400b259379d650b5e5f3b3c51e64d22beb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed9cc2883dbd039b8ec767fb752d1fa8c5533f80bf47d819598a6ac173959563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed9cc2883dbd039b8ec767fb752d1fa8c5533f80bf47d819598a6ac173959563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.727648 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39edd5b33487b860dc1a238e4508fc24ecdb0cc680826202eab418ce1fc56bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4091ee3e697bab80213ae834a71128757aea11353f7e1c20a70693b5b0a82489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.740783 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3789557-abc5-4243-9049-4afe8717cdf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7735a367b6ef7c0c6bfc0284178094381fdcf6a892b28776ee76afa35e4a8e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtwqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-49jns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.745131 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.745161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.745171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.745187 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.745199 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:58Z","lastTransitionTime":"2025-09-30T13:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.762002 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da518be6-b52d-4130-aab2-f27bfd4f9571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:35:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ad8abb3a18ac67d26ed1668c12b46523970fcbb
741bf3c9c5d599bf67891158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:36:53Z\\\",\\\"message\\\":\\\"t:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 13:36:53.346138 6823 services_controller.go:444] Built service openshift-marketplace/community-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0930 13:36:53.346146 6823 services_controller.go:445] Built service openshift-marketplace/community-operators LB template configs for network=default: []services.lbConfig(nil)\\\\nF0930 13:36:53.345935 6823 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:53Z is after 2025\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:36:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:35:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:35:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6prbc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:35:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5rtn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.773923 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rggrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"394a12b5-37c3-4933-af17-71f5c84ec2fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:36:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vdcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:36:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rggrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:36:58Z is after 2025-08-24T17:21:41Z" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.849150 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.849254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.849286 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.849325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:36:58 crc kubenswrapper[4763]: I0930 13:36:58.849350 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:36:58Z","lastTransitionTime":"2025-09-30T13:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[... 6 identical node-status cycles elided (events NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, then "Node became not ready" with the same KubeletNotReady/CNI message as above), 13:36:58.952402 through 13:36:59.475342 ...]
Sep 30 13:36:59 crc kubenswrapper[4763]: I0930 13:36:59.488788 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:36:59 crc kubenswrapper[4763]: I0930 13:36:59.488901 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:36:59 crc kubenswrapper[4763]: I0930 13:36:59.488948 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:36:59 crc kubenswrapper[4763]: E0930 13:36:59.489139 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:36:59 crc kubenswrapper[4763]: E0930 13:36:59.489307 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:36:59 crc kubenswrapper[4763]: E0930 13:36:59.489693 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[... 9 identical node-status cycles elided, 13:36:59.578303 through 13:37:00.409197 ...]
Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.489055 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv"
Sep 30 13:37:00 crc kubenswrapper[4763]: E0930 13:37:00.489290 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa"
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.511563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.511973 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.512123 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.512266 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.512399 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:00Z","lastTransitionTime":"2025-09-30T13:37:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.615807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.615883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.615908 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.615939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.615964 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:00Z","lastTransitionTime":"2025-09-30T13:37:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.719647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.719713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.719734 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.719755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.719768 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:00Z","lastTransitionTime":"2025-09-30T13:37:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.822970 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.823364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.823457 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.823561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.823676 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:00Z","lastTransitionTime":"2025-09-30T13:37:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.927195 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.927277 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.927306 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.927408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:00 crc kubenswrapper[4763]: I0930 13:37:00.927433 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:00Z","lastTransitionTime":"2025-09-30T13:37:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.031575 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.031684 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.031697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.031718 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.031732 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:01Z","lastTransitionTime":"2025-09-30T13:37:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.135391 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.135444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.135463 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.135497 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.135517 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:01Z","lastTransitionTime":"2025-09-30T13:37:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.238687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.238776 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.238802 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.238837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.238858 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:01Z","lastTransitionTime":"2025-09-30T13:37:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.489286 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.489286 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.489550 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:37:01 crc kubenswrapper[4763]: E0930 13:37:01.489798 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:01 crc kubenswrapper[4763]: E0930 13:37:01.490092 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:01 crc kubenswrapper[4763]: E0930 13:37:01.490932 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.550388 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.550442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.550454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.550477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.550500 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:01Z","lastTransitionTime":"2025-09-30T13:37:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.653737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.653782 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.653793 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.653812 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:01 crc kubenswrapper[4763]: I0930 13:37:01.653824 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:01Z","lastTransitionTime":"2025-09-30T13:37:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.488550 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv"
Sep 30 13:37:02 crc kubenswrapper[4763]: E0930 13:37:02.488893 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa"
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.590525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.590581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.590623 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.590648 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.590661 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:02Z","lastTransitionTime":"2025-09-30T13:37:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.694880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.694944 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.694964 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.694990 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.695009 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:02Z","lastTransitionTime":"2025-09-30T13:37:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.798931 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.799014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.799034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.799063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.799083 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:02Z","lastTransitionTime":"2025-09-30T13:37:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.903671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.903783 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.903822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.903861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:02 crc kubenswrapper[4763]: I0930 13:37:02.903883 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:02Z","lastTransitionTime":"2025-09-30T13:37:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.007855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.007930 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.007949 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.007978 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.007997 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:03Z","lastTransitionTime":"2025-09-30T13:37:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.112076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.112155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.112179 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.112213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.112238 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:03Z","lastTransitionTime":"2025-09-30T13:37:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.215966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.216048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.216067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.216096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.216116 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:03Z","lastTransitionTime":"2025-09-30T13:37:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.320379 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.320454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.320473 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.320507 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.320526 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:03Z","lastTransitionTime":"2025-09-30T13:37:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.489274 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:37:03 crc kubenswrapper[4763]: E0930 13:37:03.489663 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.490108 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:37:03 crc kubenswrapper[4763]: E0930 13:37:03.490203 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.490385 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:37:03 crc kubenswrapper[4763]: E0930 13:37:03.490485 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.527520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.527568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.527582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.527619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.527636 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:03Z","lastTransitionTime":"2025-09-30T13:37:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.630577 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.630662 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.630682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.630711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.630731 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:03Z","lastTransitionTime":"2025-09-30T13:37:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.733558 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.733772 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.733798 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.733826 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.733852 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:03Z","lastTransitionTime":"2025-09-30T13:37:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.837823 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.837884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.837905 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.837934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.837954 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:03Z","lastTransitionTime":"2025-09-30T13:37:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.940932 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.941025 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.941045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.941071 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:03 crc kubenswrapper[4763]: I0930 13:37:03.941090 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:03Z","lastTransitionTime":"2025-09-30T13:37:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.045161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.045236 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.045256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.045285 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.045310 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:04Z","lastTransitionTime":"2025-09-30T13:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.149120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.149182 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.149203 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.149270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.149289 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:04Z","lastTransitionTime":"2025-09-30T13:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.253669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.253747 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.253765 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.253795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.253824 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:04Z","lastTransitionTime":"2025-09-30T13:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.357456 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.357529 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.357547 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.357582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.357657 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:04Z","lastTransitionTime":"2025-09-30T13:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.460994 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.461033 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.461043 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.461058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.461068 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:04Z","lastTransitionTime":"2025-09-30T13:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:37:04 crc kubenswrapper[4763]: I0930 13:37:04.489000 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:04 crc kubenswrapper[4763]: E0930 13:37:04.489265 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
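The kubelet does not look for CNI files itself; it republishes the network-readiness status reported by the container runtime (CRI-O on this node), which loads network configs from the directory named in the message. Below is a minimal sketch of that readiness test, assuming libcni's default extension list (.conf, .conflist, .json) and the conf directory from the log; it approximates the check behind the message rather than quoting the real ocicni code.

// cnicheck.go - approximate the readiness probe behind the
// "no CNI configuration file" records above. The extension list
// mirrors libcni's defaults and is an assumption, not a copy of
// CRI-O's implementation.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func cniReady(confDir string) (bool, error) {
	entries, err := os.ReadDir(confDir)
	if err != nil {
		return false, err // a missing directory also counts as "not ready"
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // libcni's default config extensions
			return true, nil
		}
	}
	return false, fmt.Errorf("no CNI configuration file in %s. Has your network provider started?", confDir)
}

func main() {
	if ok, err := cniReady("/etc/kubernetes/cni/net.d"); !ok {
		fmt.Println("NetworkReady=false:", err)
	} else {
		fmt.Println("NetworkReady=true")
	}
}

On this cluster the component that would normally drop a config into /etc/kubernetes/cni/net.d/ is the OVN-Kubernetes node agent; its ovnkube-controller container is crash-looping later in this log, which is why the check keeps failing and the kubelet keeps publishing Ready=False on every status tick.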
[Node-status block recurs (timestamps advancing) at ~100 ms intervals from 13:37:04.594 through 13:37:05.420; nine repetitions trimmed.]
Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.488705 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.488802 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:37:05 crc kubenswrapper[4763]: E0930 13:37:05.488855 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.488923 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:37:05 crc kubenswrapper[4763]: E0930 13:37:05.488945 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:37:05 crc kubenswrapper[4763]: E0930 13:37:05.489169 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[Node-status block recurs again at 13:37:05.525, .627, .730, and .833; four repetitions trimmed.]
Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.906023 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.906070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.906080 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.906099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.906109 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:37:05Z","lastTransitionTime":"2025-09-30T13:37:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.965058 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk"] Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.965630 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.970174 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.970300 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.970638 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 30 13:37:05 crc kubenswrapper[4763]: I0930 13:37:05.972736 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.022539 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=13.022509273 podStartE2EDuration="13.022509273s" podCreationTimestamp="2025-09-30 13:36:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:06.022158565 +0000 UTC m=+98.160718860" watchObservedRunningTime="2025-09-30 13:37:06.022509273 +0000 UTC m=+98.161069568" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.061717 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d57166b-ed58-4354-82a6-80a265a7db57-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.062084 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d57166b-ed58-4354-82a6-80a265a7db57-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.062215 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d57166b-ed58-4354-82a6-80a265a7db57-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.062314 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2d57166b-ed58-4354-82a6-80a265a7db57-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.062434 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2d57166b-ed58-4354-82a6-80a265a7db57-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.065403 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podStartSLOduration=72.06536511 podStartE2EDuration="1m12.06536511s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:06.064937971 +0000 UTC m=+98.203498266" watchObservedRunningTime="2025-09-30 13:37:06.06536511 +0000 UTC m=+98.203925435" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.139628 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-c9qpw" podStartSLOduration=72.139584275 podStartE2EDuration="1m12.139584275s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:06.13934323 +0000 UTC m=+98.277903525" watchObservedRunningTime="2025-09-30 13:37:06.139584275 +0000 UTC m=+98.278144560" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.162762 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4dsgn" podStartSLOduration=72.162735007 podStartE2EDuration="1m12.162735007s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:06.162724256 +0000 UTC m=+98.301284541" watchObservedRunningTime="2025-09-30 13:37:06.162735007 +0000 UTC m=+98.301295292" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.163103 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d57166b-ed58-4354-82a6-80a265a7db57-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.163130 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d57166b-ed58-4354-82a6-80a265a7db57-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.163162 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d57166b-ed58-4354-82a6-80a265a7db57-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.163185 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2d57166b-ed58-4354-82a6-80a265a7db57-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.163232 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2d57166b-ed58-4354-82a6-80a265a7db57-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.163311 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2d57166b-ed58-4354-82a6-80a265a7db57-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.163386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2d57166b-ed58-4354-82a6-80a265a7db57-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.164023 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d57166b-ed58-4354-82a6-80a265a7db57-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.169651 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d57166b-ed58-4354-82a6-80a265a7db57-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.183446 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d57166b-ed58-4354-82a6-80a265a7db57-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dvlsk\" (UID: \"2d57166b-ed58-4354-82a6-80a265a7db57\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.189665 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.189638725 podStartE2EDuration="1m19.189638725s" podCreationTimestamp="2025-09-30 13:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:06.189389778 +0000 UTC 
m=+98.327950083" watchObservedRunningTime="2025-09-30 13:37:06.189638725 +0000 UTC m=+98.328199010"
Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.206142 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=79.206108965 podStartE2EDuration="1m19.206108965s" podCreationTimestamp="2025-09-30 13:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:06.205228586 +0000 UTC m=+98.343788871" watchObservedRunningTime="2025-09-30 13:37:06.206108965 +0000 UTC m=+98.344669250"
Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.235912 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.235894168 podStartE2EDuration="42.235894168s" podCreationTimestamp="2025-09-30 13:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:06.219272363 +0000 UTC m=+98.357832648" watchObservedRunningTime="2025-09-30 13:37:06.235894168 +0000 UTC m=+98.374454453"
Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.236459 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5fjhf" podStartSLOduration=72.236455061 podStartE2EDuration="1m12.236455061s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:06.236111763 +0000 UTC m=+98.374672078" watchObservedRunningTime="2025-09-30 13:37:06.236455061 +0000 UTC m=+98.375015346"
Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.250770 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-prttr" podStartSLOduration=72.250740373 podStartE2EDuration="1m12.250740373s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:06.250381705 +0000 UTC m=+98.388941990" watchObservedRunningTime="2025-09-30 13:37:06.250740373 +0000 UTC m=+98.389300658"
Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.283286 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk"
Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.307951 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-l26sn" podStartSLOduration=72.307933013 podStartE2EDuration="1m12.307933013s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:06.307874672 +0000 UTC m=+98.446434977" watchObservedRunningTime="2025-09-30 13:37:06.307933013 +0000 UTC m=+98.446493298"
Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.489544 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv"
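A note on reading the pod_startup_latency_tracker records: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus any time spent pulling images; the pull timestamps here are the Go zero time (0001-01-01), meaning no pull happened, so the two numbers agree. The m=+98.3... suffixes are Go's monotonic-clock reading (seconds since the kubelet process started), which time.Time.String() appends whenever the value still carries one. A small sketch of the arithmetic, using the kube-rbac-proxy-crio-crc record above as input; this helper is illustrative, not kubelet's implementation.

// startup_slo.go - rough reconstruction of the arithmetic behind the
// "Observed pod startup duration" records. In kubelet the SLO figure
// excludes image-pull time; with zero-valued pull timestamps it equals
// the end-to-end duration.
package main

import (
	"fmt"
	"time"
)

func startupDurations(created, running, pullStart, pullEnd time.Time) (slo, e2e time.Duration) {
	e2e = running.Sub(created)
	slo = e2e
	if !pullStart.IsZero() && !pullEnd.IsZero() {
		slo -= pullEnd.Sub(pullStart) // exclude image-pull time from the SLO number
	}
	return slo, e2e
}

func main() {
	created, _ := time.Parse(time.RFC3339, "2025-09-30T13:36:53Z")
	running, _ := time.Parse(time.RFC3339Nano, "2025-09-30T13:37:06.022158565Z")
	slo, e2e := startupDurations(created, running, time.Time{}, time.Time{})
	fmt.Println(slo, e2e) // ~13.022s each, matching kube-rbac-proxy-crio-crc above
}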
Sep 30 13:37:06 crc kubenswrapper[4763]: E0930 13:37:06.489742 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa"
Sep 30 13:37:06 crc kubenswrapper[4763]: I0930 13:37:06.491061 4763 scope.go:117] "RemoveContainer" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158"
Sep 30 13:37:06 crc kubenswrapper[4763]: E0930 13:37:06.491226 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571"
Sep 30 13:37:07 crc kubenswrapper[4763]: I0930 13:37:07.071976 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" event={"ID":"2d57166b-ed58-4354-82a6-80a265a7db57","Type":"ContainerStarted","Data":"9d0caf985f6dad26c42da84e5aa2a5f792088f5003e23ae2edbc549636367a21"}
Sep 30 13:37:07 crc kubenswrapper[4763]: I0930 13:37:07.072099 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" event={"ID":"2d57166b-ed58-4354-82a6-80a265a7db57","Type":"ContainerStarted","Data":"83dd1d9677f1fb817ea19608546c0777c7de59a57c00d7a742034509155e0736"}
Sep 30 13:37:07 crc kubenswrapper[4763]: I0930 13:37:07.098097 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dvlsk" podStartSLOduration=73.098037542 podStartE2EDuration="1m13.098037542s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:07.097201504 +0000 UTC m=+99.235761789" watchObservedRunningTime="2025-09-30 13:37:07.098037542 +0000 UTC m=+99.236597837"
Sep 30 13:37:07 crc kubenswrapper[4763]: I0930 13:37:07.489389 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:37:07 crc kubenswrapper[4763]: I0930 13:37:07.489541 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:37:07 crc kubenswrapper[4763]: E0930 13:37:07.490309 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:37:07 crc kubenswrapper[4763]: I0930 13:37:07.490029 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
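The "back-off 40s" figure follows the kubelet's crash-loop schedule: the restart delay starts at 10s and doubles on each failed restart, capped at 5 minutes (and reset once the container has run cleanly for a while), so 40s marks the third consecutive failed restart of ovnkube-controller. A sketch of that schedule, with the 10s/5m constants taken as kubelet defaults rather than read from this cluster:

// crashloop_backoff.go - sketch of the delay schedule behind
// "back-off 40s restarting failed container=ovnkube-controller".
package main

import (
	"fmt"
	"time"
)

func restartDelay(failures int) time.Duration {
	d := 10 * time.Second // assumed kubelet default initial delay
	for i := 1; i < failures; i++ {
		d *= 2
		if d >= 5*time.Minute { // assumed kubelet default cap
			return 5 * time.Minute
		}
	}
	return d
}

func main() {
	for n := 1; n <= 6; n++ {
		fmt.Printf("restart %d -> wait %v\n", n, restartDelay(n))
	}
	// restart 3 -> wait 40s matches the ovnkube-controller record above
}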
Sep 30 13:37:07 crc kubenswrapper[4763]: E0930 13:37:07.490490 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:37:07 crc kubenswrapper[4763]: E0930 13:37:07.490544 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 13:37:07 crc kubenswrapper[4763]: I0930 13:37:07.515774 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Sep 30 13:37:08 crc kubenswrapper[4763]: I0930 13:37:08.490875 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv"
Sep 30 13:37:08 crc kubenswrapper[4763]: E0930 13:37:08.494584 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa"
Sep 30 13:37:08 crc kubenswrapper[4763]: I0930 13:37:08.536084 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.53598736 podStartE2EDuration="1.53598736s" podCreationTimestamp="2025-09-30 13:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:08.533537126 +0000 UTC m=+100.672097421" watchObservedRunningTime="2025-09-30 13:37:08.53598736 +0000 UTC m=+100.674547665"
Sep 30 13:37:09 crc kubenswrapper[4763]: I0930 13:37:09.489362 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:37:09 crc kubenswrapper[4763]: I0930 13:37:09.489465 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:37:09 crc kubenswrapper[4763]: I0930 13:37:09.489505 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:37:09 crc kubenswrapper[4763]: E0930 13:37:09.489551 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:09 crc kubenswrapper[4763]: E0930 13:37:09.489688 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:09 crc kubenswrapper[4763]: E0930 13:37:09.489843 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:10 crc kubenswrapper[4763]: I0930 13:37:10.489027 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:10 crc kubenswrapper[4763]: E0930 13:37:10.489183 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:11 crc kubenswrapper[4763]: I0930 13:37:11.489011 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:11 crc kubenswrapper[4763]: I0930 13:37:11.489024 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:11 crc kubenswrapper[4763]: E0930 13:37:11.489223 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:11 crc kubenswrapper[4763]: I0930 13:37:11.489011 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:11 crc kubenswrapper[4763]: E0930 13:37:11.489439 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:37:11 crc kubenswrapper[4763]: E0930 13:37:11.489695 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:37:12 crc kubenswrapper[4763]: I0930 13:37:12.488872 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv"
Sep 30 13:37:12 crc kubenswrapper[4763]: E0930 13:37:12.489572 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa"
Sep 30 13:37:12 crc kubenswrapper[4763]: I0930 13:37:12.642963 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv"
Sep 30 13:37:12 crc kubenswrapper[4763]: E0930 13:37:12.643180 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 30 13:37:12 crc kubenswrapper[4763]: E0930 13:37:12.643235 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs podName:394a12b5-37c3-4933-af17-71f5c84ec2fa nodeName:}" failed. No retries permitted until 2025-09-30 13:38:16.643219613 +0000 UTC m=+168.781779898 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs") pod "network-metrics-daemon-rggrv" (UID: "394a12b5-37c3-4933-af17-71f5c84ec2fa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 30 13:37:13 crc kubenswrapper[4763]: I0930 13:37:13.489149 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:37:13 crc kubenswrapper[4763]: I0930 13:37:13.489196 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:37:13 crc kubenswrapper[4763]: I0930 13:37:13.489261 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:37:13 crc kubenswrapper[4763]: E0930 13:37:13.489451 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
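Two independent failures overlap on network-metrics-daemon-rggrv here: its sandbox cannot be created because the CNI config is missing, and separately its metrics-certs secret volume cannot mount because the metrics-daemon-secret object is not yet visible to the kubelet. The volume layer retries on its own exponential schedule; durationBeforeRetry 1m4s is consistent with a 500 ms initial delay doubling per failure (taken here as the usual operation-executor defaults, not read from this cluster), i.e. the eighth straight attempt, which is why the next retry is pushed out to 13:38:16. A sketch:

// mount_retry.go - sketch of the exponential retry visible in
// "durationBeforeRetry 1m4s". Initial delay and cap are assumed
// defaults for kubelet's volume operation executor.
package main

import (
	"fmt"
	"time"
)

func durationBeforeRetry(failures int) time.Duration {
	const (
		initial = 500 * time.Millisecond
		cap     = 2*time.Minute + 2*time.Second
	)
	d := initial
	for i := 1; i < failures; i++ {
		d *= 2
		if d > cap {
			return cap
		}
	}
	return d
}

func main() {
	fmt.Println(durationBeforeRetry(8)) // 1m4s, as in the nestedpendingoperations record above
}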
Sep 30 13:37:13 crc kubenswrapper[4763]: E0930 13:37:13.489634 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 13:37:13 crc kubenswrapper[4763]: E0930 13:37:13.489757 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:37:14 crc kubenswrapper[4763]: I0930 13:37:14.488718 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv"
Sep 30 13:37:14 crc kubenswrapper[4763]: E0930 13:37:14.489409 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa"
Sep 30 13:37:15 crc kubenswrapper[4763]: I0930 13:37:15.489553 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:37:15 crc kubenswrapper[4763]: I0930 13:37:15.489721 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:37:15 crc kubenswrapper[4763]: I0930 13:37:15.489742 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:37:15 crc kubenswrapper[4763]: E0930 13:37:15.489853 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:37:15 crc kubenswrapper[4763]: E0930 13:37:15.489975 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:15 crc kubenswrapper[4763]: E0930 13:37:15.490234 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:16 crc kubenswrapper[4763]: I0930 13:37:16.489510 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:16 crc kubenswrapper[4763]: E0930 13:37:16.489888 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:17 crc kubenswrapper[4763]: I0930 13:37:17.489079 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:17 crc kubenswrapper[4763]: I0930 13:37:17.489220 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:17 crc kubenswrapper[4763]: I0930 13:37:17.489118 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:17 crc kubenswrapper[4763]: E0930 13:37:17.489350 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:17 crc kubenswrapper[4763]: E0930 13:37:17.489540 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:17 crc kubenswrapper[4763]: E0930 13:37:17.489772 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:18 crc kubenswrapper[4763]: I0930 13:37:18.489019 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:18 crc kubenswrapper[4763]: E0930 13:37:18.491257 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:19 crc kubenswrapper[4763]: I0930 13:37:19.488639 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:19 crc kubenswrapper[4763]: I0930 13:37:19.488703 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:19 crc kubenswrapper[4763]: I0930 13:37:19.488786 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:19 crc kubenswrapper[4763]: E0930 13:37:19.488884 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:19 crc kubenswrapper[4763]: E0930 13:37:19.489044 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:19 crc kubenswrapper[4763]: E0930 13:37:19.489336 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:20 crc kubenswrapper[4763]: I0930 13:37:20.489185 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:20 crc kubenswrapper[4763]: E0930 13:37:20.489436 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:21 crc kubenswrapper[4763]: I0930 13:37:21.489377 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:21 crc kubenswrapper[4763]: I0930 13:37:21.489442 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:21 crc kubenswrapper[4763]: I0930 13:37:21.489412 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:21 crc kubenswrapper[4763]: E0930 13:37:21.489588 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:21 crc kubenswrapper[4763]: E0930 13:37:21.489798 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:21 crc kubenswrapper[4763]: E0930 13:37:21.491197 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:21 crc kubenswrapper[4763]: I0930 13:37:21.491891 4763 scope.go:117] "RemoveContainer" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" Sep 30 13:37:21 crc kubenswrapper[4763]: E0930 13:37:21.492217 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" Sep 30 13:37:22 crc kubenswrapper[4763]: I0930 13:37:22.489302 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:22 crc kubenswrapper[4763]: E0930 13:37:22.489675 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:23 crc kubenswrapper[4763]: I0930 13:37:23.489264 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:23 crc kubenswrapper[4763]: I0930 13:37:23.489294 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:23 crc kubenswrapper[4763]: E0930 13:37:23.489551 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:23 crc kubenswrapper[4763]: I0930 13:37:23.489393 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:23 crc kubenswrapper[4763]: E0930 13:37:23.489660 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:23 crc kubenswrapper[4763]: E0930 13:37:23.489893 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:24 crc kubenswrapper[4763]: I0930 13:37:24.489082 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:24 crc kubenswrapper[4763]: E0930 13:37:24.489939 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:25 crc kubenswrapper[4763]: I0930 13:37:25.488926 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:25 crc kubenswrapper[4763]: I0930 13:37:25.488991 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:25 crc kubenswrapper[4763]: I0930 13:37:25.489118 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:25 crc kubenswrapper[4763]: E0930 13:37:25.489266 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:25 crc kubenswrapper[4763]: E0930 13:37:25.489459 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:25 crc kubenswrapper[4763]: E0930 13:37:25.489726 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:26 crc kubenswrapper[4763]: I0930 13:37:26.488945 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:26 crc kubenswrapper[4763]: E0930 13:37:26.489237 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:27 crc kubenswrapper[4763]: I0930 13:37:27.489245 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:27 crc kubenswrapper[4763]: I0930 13:37:27.489272 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:27 crc kubenswrapper[4763]: I0930 13:37:27.489269 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:27 crc kubenswrapper[4763]: E0930 13:37:27.490285 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:27 crc kubenswrapper[4763]: E0930 13:37:27.490399 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:27 crc kubenswrapper[4763]: E0930 13:37:27.489947 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:28 crc kubenswrapper[4763]: E0930 13:37:28.455562 4763 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 13:37:28 crc kubenswrapper[4763]: I0930 13:37:28.488951 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:28 crc kubenswrapper[4763]: E0930 13:37:28.490872 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:28 crc kubenswrapper[4763]: E0930 13:37:28.580063 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 13:37:29 crc kubenswrapper[4763]: I0930 13:37:29.491594 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:29 crc kubenswrapper[4763]: I0930 13:37:29.491693 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:29 crc kubenswrapper[4763]: I0930 13:37:29.491670 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:29 crc kubenswrapper[4763]: E0930 13:37:29.491907 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:29 crc kubenswrapper[4763]: E0930 13:37:29.492080 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:29 crc kubenswrapper[4763]: E0930 13:37:29.492834 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:30 crc kubenswrapper[4763]: I0930 13:37:30.159542 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9qpw_766e1024-d943-4721-a366-83bc3635cc79/kube-multus/1.log" Sep 30 13:37:30 crc kubenswrapper[4763]: I0930 13:37:30.160639 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9qpw_766e1024-d943-4721-a366-83bc3635cc79/kube-multus/0.log" Sep 30 13:37:30 crc kubenswrapper[4763]: I0930 13:37:30.160726 4763 generic.go:334] "Generic (PLEG): container finished" podID="766e1024-d943-4721-a366-83bc3635cc79" containerID="2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e" exitCode=1 Sep 30 13:37:30 crc kubenswrapper[4763]: I0930 13:37:30.160783 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9qpw" event={"ID":"766e1024-d943-4721-a366-83bc3635cc79","Type":"ContainerDied","Data":"2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e"} Sep 30 13:37:30 crc kubenswrapper[4763]: I0930 13:37:30.160845 4763 scope.go:117] "RemoveContainer" containerID="821c6767cc5f0f333c6b944b3f26b5815d4d5ad9c47c5dc4c68e370ab72275e2" Sep 30 13:37:30 crc kubenswrapper[4763]: I0930 13:37:30.161801 4763 scope.go:117] "RemoveContainer" containerID="2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e" Sep 30 13:37:30 crc kubenswrapper[4763]: E0930 13:37:30.162439 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-c9qpw_openshift-multus(766e1024-d943-4721-a366-83bc3635cc79)\"" pod="openshift-multus/multus-c9qpw" podUID="766e1024-d943-4721-a366-83bc3635cc79" Sep 30 13:37:30 crc kubenswrapper[4763]: I0930 13:37:30.489299 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:30 crc kubenswrapper[4763]: E0930 13:37:30.490139 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:31 crc kubenswrapper[4763]: I0930 13:37:31.167523 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9qpw_766e1024-d943-4721-a366-83bc3635cc79/kube-multus/1.log" Sep 30 13:37:31 crc kubenswrapper[4763]: I0930 13:37:31.488823 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:31 crc kubenswrapper[4763]: I0930 13:37:31.488835 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:31 crc kubenswrapper[4763]: E0930 13:37:31.489088 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:31 crc kubenswrapper[4763]: I0930 13:37:31.488865 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:31 crc kubenswrapper[4763]: E0930 13:37:31.489317 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:31 crc kubenswrapper[4763]: E0930 13:37:31.489541 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:32 crc kubenswrapper[4763]: I0930 13:37:32.489112 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:32 crc kubenswrapper[4763]: E0930 13:37:32.489515 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:32 crc kubenswrapper[4763]: I0930 13:37:32.491015 4763 scope.go:117] "RemoveContainer" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" Sep 30 13:37:32 crc kubenswrapper[4763]: E0930 13:37:32.491314 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5rtn6_openshift-ovn-kubernetes(da518be6-b52d-4130-aab2-f27bfd4f9571)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" Sep 30 13:37:33 crc kubenswrapper[4763]: I0930 13:37:33.489183 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:33 crc kubenswrapper[4763]: I0930 13:37:33.489218 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:33 crc kubenswrapper[4763]: I0930 13:37:33.489208 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:33 crc kubenswrapper[4763]: E0930 13:37:33.489372 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:33 crc kubenswrapper[4763]: E0930 13:37:33.489455 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:33 crc kubenswrapper[4763]: E0930 13:37:33.489562 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:33 crc kubenswrapper[4763]: E0930 13:37:33.581393 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 13:37:34 crc kubenswrapper[4763]: I0930 13:37:34.488778 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:34 crc kubenswrapper[4763]: E0930 13:37:34.489027 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:35 crc kubenswrapper[4763]: I0930 13:37:35.488912 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:35 crc kubenswrapper[4763]: I0930 13:37:35.488992 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:35 crc kubenswrapper[4763]: I0930 13:37:35.488926 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:35 crc kubenswrapper[4763]: E0930 13:37:35.489160 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:35 crc kubenswrapper[4763]: E0930 13:37:35.489318 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:35 crc kubenswrapper[4763]: E0930 13:37:35.489430 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:36 crc kubenswrapper[4763]: I0930 13:37:36.488540 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:36 crc kubenswrapper[4763]: E0930 13:37:36.488790 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:37 crc kubenswrapper[4763]: I0930 13:37:37.489420 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:37 crc kubenswrapper[4763]: I0930 13:37:37.489511 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:37 crc kubenswrapper[4763]: I0930 13:37:37.489567 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:37 crc kubenswrapper[4763]: E0930 13:37:37.489754 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:37 crc kubenswrapper[4763]: E0930 13:37:37.489880 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:37 crc kubenswrapper[4763]: E0930 13:37:37.489668 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:38 crc kubenswrapper[4763]: I0930 13:37:38.489319 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:38 crc kubenswrapper[4763]: E0930 13:37:38.491850 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:38 crc kubenswrapper[4763]: E0930 13:37:38.583034 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 13:37:39 crc kubenswrapper[4763]: I0930 13:37:39.488648 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:39 crc kubenswrapper[4763]: E0930 13:37:39.488843 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:39 crc kubenswrapper[4763]: I0930 13:37:39.489041 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:39 crc kubenswrapper[4763]: E0930 13:37:39.489293 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:39 crc kubenswrapper[4763]: I0930 13:37:39.489057 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:39 crc kubenswrapper[4763]: E0930 13:37:39.489938 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:40 crc kubenswrapper[4763]: I0930 13:37:40.489006 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:40 crc kubenswrapper[4763]: E0930 13:37:40.489240 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:41 crc kubenswrapper[4763]: I0930 13:37:41.488545 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:41 crc kubenswrapper[4763]: I0930 13:37:41.488571 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:41 crc kubenswrapper[4763]: E0930 13:37:41.488848 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:41 crc kubenswrapper[4763]: E0930 13:37:41.489016 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:41 crc kubenswrapper[4763]: I0930 13:37:41.489313 4763 scope.go:117] "RemoveContainer" containerID="2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e" Sep 30 13:37:41 crc kubenswrapper[4763]: I0930 13:37:41.489448 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:41 crc kubenswrapper[4763]: E0930 13:37:41.489588 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:42 crc kubenswrapper[4763]: I0930 13:37:42.220842 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9qpw_766e1024-d943-4721-a366-83bc3635cc79/kube-multus/1.log" Sep 30 13:37:42 crc kubenswrapper[4763]: I0930 13:37:42.220966 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9qpw" event={"ID":"766e1024-d943-4721-a366-83bc3635cc79","Type":"ContainerStarted","Data":"5b43269fae80af1d4f4436c691aca5e5984ef49e50d5581da67884a6052cbef2"} Sep 30 13:37:42 crc kubenswrapper[4763]: I0930 13:37:42.489281 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:42 crc kubenswrapper[4763]: E0930 13:37:42.489576 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:43 crc kubenswrapper[4763]: I0930 13:37:43.489478 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:43 crc kubenswrapper[4763]: I0930 13:37:43.489571 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:43 crc kubenswrapper[4763]: E0930 13:37:43.489802 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:43 crc kubenswrapper[4763]: E0930 13:37:43.489925 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:43 crc kubenswrapper[4763]: I0930 13:37:43.489974 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:43 crc kubenswrapper[4763]: E0930 13:37:43.490131 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:43 crc kubenswrapper[4763]: E0930 13:37:43.584474 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 13:37:44 crc kubenswrapper[4763]: I0930 13:37:44.489136 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:44 crc kubenswrapper[4763]: E0930 13:37:44.489467 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:45 crc kubenswrapper[4763]: I0930 13:37:45.488566 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:45 crc kubenswrapper[4763]: E0930 13:37:45.489170 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:45 crc kubenswrapper[4763]: I0930 13:37:45.488694 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:45 crc kubenswrapper[4763]: E0930 13:37:45.489277 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:45 crc kubenswrapper[4763]: I0930 13:37:45.488694 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:45 crc kubenswrapper[4763]: E0930 13:37:45.489359 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:46 crc kubenswrapper[4763]: I0930 13:37:46.489124 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:46 crc kubenswrapper[4763]: E0930 13:37:46.489372 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:46 crc kubenswrapper[4763]: I0930 13:37:46.490521 4763 scope.go:117] "RemoveContainer" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" Sep 30 13:37:47 crc kubenswrapper[4763]: I0930 13:37:47.242951 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/3.log" Sep 30 13:37:47 crc kubenswrapper[4763]: I0930 13:37:47.245400 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerStarted","Data":"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91"} Sep 30 13:37:47 crc kubenswrapper[4763]: I0930 13:37:47.245959 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:37:47 crc kubenswrapper[4763]: I0930 13:37:47.277795 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podStartSLOduration=113.277768432 podStartE2EDuration="1m53.277768432s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:37:47.276729538 +0000 UTC m=+139.415289843" watchObservedRunningTime="2025-09-30 13:37:47.277768432 +0000 UTC m=+139.416328717" Sep 30 13:37:47 crc kubenswrapper[4763]: I0930 13:37:47.488744 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:47 crc kubenswrapper[4763]: I0930 13:37:47.488786 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:47 crc kubenswrapper[4763]: I0930 13:37:47.488749 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:47 crc kubenswrapper[4763]: E0930 13:37:47.488992 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:47 crc kubenswrapper[4763]: E0930 13:37:47.489089 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:47 crc kubenswrapper[4763]: E0930 13:37:47.489242 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:47 crc kubenswrapper[4763]: I0930 13:37:47.519197 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rggrv"] Sep 30 13:37:47 crc kubenswrapper[4763]: I0930 13:37:47.519427 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:47 crc kubenswrapper[4763]: E0930 13:37:47.519697 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:48 crc kubenswrapper[4763]: E0930 13:37:48.585698 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 13:37:49 crc kubenswrapper[4763]: I0930 13:37:49.488499 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:49 crc kubenswrapper[4763]: I0930 13:37:49.488662 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:49 crc kubenswrapper[4763]: E0930 13:37:49.489062 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:49 crc kubenswrapper[4763]: I0930 13:37:49.488783 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:49 crc kubenswrapper[4763]: E0930 13:37:49.489306 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:49 crc kubenswrapper[4763]: I0930 13:37:49.488730 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:49 crc kubenswrapper[4763]: E0930 13:37:49.489471 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:49 crc kubenswrapper[4763]: E0930 13:37:49.489538 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:51 crc kubenswrapper[4763]: I0930 13:37:51.488777 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:51 crc kubenswrapper[4763]: I0930 13:37:51.488853 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:51 crc kubenswrapper[4763]: I0930 13:37:51.488853 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:51 crc kubenswrapper[4763]: I0930 13:37:51.488881 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:51 crc kubenswrapper[4763]: E0930 13:37:51.489006 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:51 crc kubenswrapper[4763]: E0930 13:37:51.489242 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:51 crc kubenswrapper[4763]: E0930 13:37:51.489373 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:51 crc kubenswrapper[4763]: E0930 13:37:51.489571 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:52 crc kubenswrapper[4763]: I0930 13:37:52.376839 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:37:53 crc kubenswrapper[4763]: I0930 13:37:53.488773 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:53 crc kubenswrapper[4763]: I0930 13:37:53.488830 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:53 crc kubenswrapper[4763]: I0930 13:37:53.488877 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:53 crc kubenswrapper[4763]: I0930 13:37:53.489059 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:53 crc kubenswrapper[4763]: E0930 13:37:53.489089 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:37:53 crc kubenswrapper[4763]: E0930 13:37:53.489212 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:37:53 crc kubenswrapper[4763]: E0930 13:37:53.489381 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rggrv" podUID="394a12b5-37c3-4933-af17-71f5c84ec2fa" Sep 30 13:37:53 crc kubenswrapper[4763]: E0930 13:37:53.489569 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.309516 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.309929 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:39:57.309899311 +0000 UTC m=+269.448459636 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.410198 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.410257 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.410278 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.410315 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.410395 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.410464 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:57.410446515 +0000 UTC m=+269.549006800 (durationBeforeRetry 2m2s). 
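The TearDown failure above is not about the volume itself: after the kubelet restart, the kubevirt.io.hostpath-provisioner driver simply has not re-registered yet, so the CSI client lookup fails. CSI drivers register by dropping a socket into the kubelet's plugins_registry directory (typically via a node-driver-registrar sidecar). A quick check of that directory, assuming the default kubelet root dir /var/lib/kubelet:

```python
#!/usr/bin/env python3
"""List CSI driver registration sockets on a node.

A sketch: plugins_registry under the kubelet root dir is where
node-driver-registrar sidecars drop their sockets. An empty listing
while a teardown is pending matches the "not found in the list of
registered CSI drivers" error above.
"""
from pathlib import Path

REG_DIR = Path("/var/lib/kubelet/plugins_registry")

if not REG_DIR.is_dir():
    print(f"{REG_DIR} missing: kubelet root dir may be non-default")
else:
    socks = sorted(REG_DIR.glob("*.sock"))
    if not socks:
        print("no registration sockets: no CSI driver is registered yet")
    for s in socks:
        print(s.name)  # e.g. <driver-name>-reg.sock
```

Once the provisioner pod comes back up and its registration socket reappears, the deferred unmount scheduled for 13:39:57 should succeed on its own.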
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.410531 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.410561 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.410651 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.410840 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.410653 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.410969 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.411001 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.410792 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:57.410750122 +0000 UTC m=+269.549310477 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.411133 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:57.41109382 +0000 UTC m=+269.549654175 (durationBeforeRetry 2m2s). 
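These `object "ns"/"name" not registered` failures are a cold-cache symptom, not missing API objects: the kubelet serves secret and configmap volume contents from per-namespace reflector caches that it only starts once a pod referencing them is admitted, so the first mount attempts after a restart race against cache startup. The "Caches populated" records a few entries below (13:37:55.49) mark the moment these same objects become available. A sketch that pairs each failure with its population record in the assumed kubelet.log:

```python
#!/usr/bin/env python3
"""Pair "not registered" mount failures with later cache population.

Same one-record-per-line kubelet.log assumption as the earlier
sketches: for every object "ns"/"name" reported as not registered,
print when the matching "Caches populated" record first appears.
"""
import re

NOTREG = re.compile(r'object "([^"]+)"/"([^"]+)" not registered')
POP = re.compile(r'Caches populated for \S+ from object-"([^"]+)"/"([^"]+)"')
TS = re.compile(r'[IEW]\d{4} (\d{2}:\d{2}:\d{2}\.\d{6})')

pending, resolved = {}, {}
with open("kubelet.log", encoding="utf-8", errors="replace") as fh:
    for line in fh:
        m = TS.search(line)
        ts = m.group(1) if m else "?"
        for key in NOTREG.findall(line):
            pending.setdefault(key, ts)
        for key in POP.findall(line):
            resolved.setdefault(key, ts)

for (ns, name), first in sorted(pending.items()):
    print(f"{ns}/{name}: failed at {first}, populated at {resolved.get((ns, name), 'never')}")
```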
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:37:55 crc kubenswrapper[4763]: E0930 13:37:55.411154 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:57.411146861 +0000 UTC m=+269.549707276 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.488555 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.488691 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.488595 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.488565 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.492987 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.493063 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.493115 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.492988 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.493194 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 13:37:55 crc kubenswrapper[4763]: I0930 13:37:55.493223 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.742264 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.808996 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9rcjp"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.809640 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.812733 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6ts49"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.813858 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.814080 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.814457 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.815818 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.816505 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.817379 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.818029 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.821730 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.822800 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jqx2k"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.824823 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pmcm\" (UniqueName: \"kubernetes.io/projected/c86dd83f-1362-49d9-aecb-9e86cb66ebcd-kube-api-access-7pmcm\") pod \"cluster-samples-operator-665b6dd947-l7pfj\" (UID: \"c86dd83f-1362-49d9-aecb-9e86cb66ebcd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.824910 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c86dd83f-1362-49d9-aecb-9e86cb66ebcd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l7pfj\" (UID: \"c86dd83f-1362-49d9-aecb-9e86cb66ebcd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.825680 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9eaed9c6-6995-4062-8c6b-a41853220149-serving-cert\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.830468 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-config\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.830577 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-client-ca\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.830669 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.831015 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.831392 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.844009 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.864004 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.872154 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.872848 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.873121 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.873446 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.873472 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.873862 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.873956 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.877882 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.878115 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.878363 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.878512 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.878747 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.879056 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.879231 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.879399 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.879740 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.879902 4763 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.880048 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.880194 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.881219 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.881497 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.881723 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.881896 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.882111 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.882375 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.882735 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.882962 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.883095 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.883135 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.883259 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.883285 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.883404 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.883596 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.883768 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.883929 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.884271 4763 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.884430 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.887897 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vjdsr"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.888358 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-64zqx"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.889457 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.892879 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-p5rvt"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.893268 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f7g5q"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.894741 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.894848 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-f7g5q" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.895101 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.911492 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.917182 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.917640 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.929174 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.929455 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.930040 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.930202 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.930390 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.931728 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.932224 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.932467 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9rcjp"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933119 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-client-ca\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933161 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-dir\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933180 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-serving-cert\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f98df76-283e-4a40-8985-e876b83119ce-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wrz4k\" (UID: \"2f98df76-283e-4a40-8985-e876b83119ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933216 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-w5khf\" (UniqueName: \"kubernetes.io/projected/ee6b454d-afd6-400e-8f72-1880a5485abf-kube-api-access-w5khf\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933231 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-trusted-ca-bundle\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933249 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-audit-dir\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933262 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4464e7-c998-48f2-bac7-bf0da585931e-config\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933304 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-config\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933340 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2f4l\" (UniqueName: \"kubernetes.io/projected/9c7d4b69-3286-49c0-8a83-74bcccf25345-kube-api-access-t2f4l\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933358 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-config\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933372 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc5d4\" (UniqueName: \"kubernetes.io/projected/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-kube-api-access-wc5d4\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933387 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/ee6b454d-afd6-400e-8f72-1880a5485abf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933409 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933433 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-client-ca\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933454 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-serving-cert\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933477 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933503 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltcsc\" (UniqueName: \"kubernetes.io/projected/2f98df76-283e-4a40-8985-e876b83119ce-kube-api-access-ltcsc\") pod \"openshift-apiserver-operator-796bbdcf4f-wrz4k\" (UID: \"2f98df76-283e-4a40-8985-e876b83119ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933522 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pmcm\" (UniqueName: \"kubernetes.io/projected/c86dd83f-1362-49d9-aecb-9e86cb66ebcd-kube-api-access-7pmcm\") pod \"cluster-samples-operator-665b6dd947-l7pfj\" (UID: \"c86dd83f-1362-49d9-aecb-9e86cb66ebcd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933539 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcgck\" (UniqueName: \"kubernetes.io/projected/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-kube-api-access-xcgck\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933554 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9d4464e7-c998-48f2-bac7-bf0da585931e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933573 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-machine-approver-tls\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933589 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/817d1626-d4a3-4df7-bbbd-0ae698936819-images\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933655 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d4464e7-c998-48f2-bac7-bf0da585931e-service-ca-bundle\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933675 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8fbs\" (UniqueName: \"kubernetes.io/projected/5e6e5780-d702-4c2e-9045-3e74bb98136a-kube-api-access-l8fbs\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933690 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/817d1626-d4a3-4df7-bbbd-0ae698936819-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933706 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-etcd-serving-ca\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcq6c\" (UniqueName: \"kubernetes.io/projected/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-kube-api-access-lcq6c\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-auth-proxy-config\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933755 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-image-import-ca\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933776 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee6b454d-afd6-400e-8f72-1880a5485abf-serving-cert\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933793 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ee6b454d-afd6-400e-8f72-1880a5485abf-etcd-client\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933824 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-config\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933843 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/76012636-5dea-475a-bc3e-bdcac5a79760-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933868 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9eaed9c6-6995-4062-8c6b-a41853220149-serving-cert\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933882 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ee6b454d-afd6-400e-8f72-1880a5485abf-audit-policies\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933899 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fhg6\" (UniqueName: \"kubernetes.io/projected/9d4464e7-c998-48f2-bac7-bf0da585931e-kube-api-access-9fhg6\") pod \"authentication-operator-69f744f599-64zqx\" (UID: 
\"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933915 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5925c888-ce34-47a0-aa48-ee913adef673-trusted-ca\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933931 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5925c888-ce34-47a0-aa48-ee913adef673-config\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933947 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-policies\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933966 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933982 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ee6b454d-afd6-400e-8f72-1880a5485abf-encryption-config\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.933996 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f98df76-283e-4a40-8985-e876b83119ce-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wrz4k\" (UID: \"2f98df76-283e-4a40-8985-e876b83119ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934012 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-encryption-config\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934029 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2rqw\" (UniqueName: \"kubernetes.io/projected/817d1626-d4a3-4df7-bbbd-0ae698936819-kube-api-access-g2rqw\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934045 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-config\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934061 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7d4b69-3286-49c0-8a83-74bcccf25345-serving-cert\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934077 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934093 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934110 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934126 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934142 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9d2p\" (UniqueName: \"kubernetes.io/projected/9eaed9c6-6995-4062-8c6b-a41853220149-kube-api-access-f9d2p\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934158 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-service-ca\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " 
pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934171 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934188 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817d1626-d4a3-4df7-bbbd-0ae698936819-config\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934204 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-etcd-client\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934220 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-oauth-serving-cert\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934235 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-audit\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934266 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76012636-5dea-475a-bc3e-bdcac5a79760-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934282 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76012636-5dea-475a-bc3e-bdcac5a79760-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 
13:37:56.934301 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c86dd83f-1362-49d9-aecb-9e86cb66ebcd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l7pfj\" (UID: \"c86dd83f-1362-49d9-aecb-9e86cb66ebcd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee6b454d-afd6-400e-8f72-1880a5485abf-audit-dir\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934335 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934472 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934572 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934689 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934788 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934901 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.935007 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.935150 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.935493 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-config\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.935967 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936146 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-client-ca\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.934336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936311 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936333 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-oauth-config\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936350 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-node-pullsecrets\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936366 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zpps\" (UniqueName: \"kubernetes.io/projected/76012636-5dea-475a-bc3e-bdcac5a79760-kube-api-access-2zpps\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936381 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee6b454d-afd6-400e-8f72-1880a5485abf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936403 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936419 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9d4464e7-c998-48f2-bac7-bf0da585931e-serving-cert\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936433 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cd65\" (UniqueName: \"kubernetes.io/projected/5925c888-ce34-47a0-aa48-ee913adef673-kube-api-access-9cd65\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936448 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-config\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.936462 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5925c888-ce34-47a0-aa48-ee913adef673-serving-cert\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.937386 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.937901 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.938090 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jqx2k"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.938643 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.938978 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.943488 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.943516 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.944116 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.944181 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.944319 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.944470 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.944666 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.946038 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.946137 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9eaed9c6-6995-4062-8c6b-a41853220149-serving-cert\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.946998 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv"] Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.959197 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c86dd83f-1362-49d9-aecb-9e86cb66ebcd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l7pfj\" (UID: \"c86dd83f-1362-49d9-aecb-9e86cb66ebcd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.959507 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.959997 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.960028 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.960116 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.960673 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.960694 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.960859 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.960874 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.961480 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.962977 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.965939 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.966113 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.966258 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.966361 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.966454 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.966546 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.966666 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.966756 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.966859 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.966944 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.967039 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.967123 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 13:37:56 crc kubenswrapper[4763]: I0930 13:37:56.967213 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.014286 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.016047 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.020257 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jmpjx"] Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.021800 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.023027 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.026329 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.027317 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.027618 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.028206 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg"] Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.028590 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pmcm\" (UniqueName: \"kubernetes.io/projected/c86dd83f-1362-49d9-aecb-9e86cb66ebcd-kube-api-access-7pmcm\") pod \"cluster-samples-operator-665b6dd947-l7pfj\" (UID: \"c86dd83f-1362-49d9-aecb-9e86cb66ebcd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.028770 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr"] Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.028846 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.029391 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.029458 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.029638 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.029724 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.029985 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.030224 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-f228d"] Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.030559 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f7g5q"] Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.030585 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-88dg6"] Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.030866 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-f228d" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.030997 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.031147 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.031586 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.031901 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.032937 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.033145 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.033397 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.033669 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.034027 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6ts49"] Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.034076 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq"] Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.034325 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.034372 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dqjfv"] Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.034698 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.034933 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.035180 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.037309 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.037411 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-serving-cert\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.037498 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.037570 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smnng\" (UniqueName: \"kubernetes.io/projected/eb3a4cfd-1db3-488c-ba4c-bd04add6bd05-kube-api-access-smnng\") pod \"downloads-7954f5f757-f228d\" (UID: \"eb3a4cfd-1db3-488c-ba4c-bd04add6bd05\") " pod="openshift-console/downloads-7954f5f757-f228d" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.037677 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltcsc\" (UniqueName: \"kubernetes.io/projected/2f98df76-283e-4a40-8985-e876b83119ce-kube-api-access-ltcsc\") pod \"openshift-apiserver-operator-796bbdcf4f-wrz4k\" (UID: \"2f98df76-283e-4a40-8985-e876b83119ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.037746 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcgck\" (UniqueName: \"kubernetes.io/projected/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-kube-api-access-xcgck\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.037812 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d4464e7-c998-48f2-bac7-bf0da585931e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.037877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-machine-approver-tls\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.037946 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/817d1626-d4a3-4df7-bbbd-0ae698936819-images\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.038011 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d4464e7-c998-48f2-bac7-bf0da585931e-service-ca-bundle\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.038078 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8fbs\" (UniqueName: \"kubernetes.io/projected/5e6e5780-d702-4c2e-9045-3e74bb98136a-kube-api-access-l8fbs\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.038143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/817d1626-d4a3-4df7-bbbd-0ae698936819-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.038212 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-etcd-serving-ca\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.038353 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcq6c\" (UniqueName: \"kubernetes.io/projected/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-kube-api-access-lcq6c\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.038419 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-auth-proxy-config\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.043832 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ckscg\" (UID: \"35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.043949 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-image-import-ca\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.044026 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee6b454d-afd6-400e-8f72-1880a5485abf-serving-cert\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.044090 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ee6b454d-afd6-400e-8f72-1880a5485abf-etcd-client\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.044174 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-config\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.044241 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/76012636-5dea-475a-bc3e-bdcac5a79760-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.041310 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d4464e7-c998-48f2-bac7-bf0da585931e-service-ca-bundle\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.037985 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.043398 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-auth-proxy-config\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.040488 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/817d1626-d4a3-4df7-bbbd-0ae698936819-images\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.039029 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.044321 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ee6b454d-afd6-400e-8f72-1880a5485abf-audit-policies\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.045862 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fhg6\" (UniqueName: \"kubernetes.io/projected/9d4464e7-c998-48f2-bac7-bf0da585931e-kube-api-access-9fhg6\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.045893 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5925c888-ce34-47a0-aa48-ee913adef673-trusted-ca\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.045917 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5925c888-ce34-47a0-aa48-ee913adef673-config\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.045936 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-policies\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.045956 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ckscg\" (UID: \"35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.045984 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046001 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ee6b454d-afd6-400e-8f72-1880a5485abf-encryption-config\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: 
\"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046017 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f98df76-283e-4a40-8985-e876b83119ce-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wrz4k\" (UID: \"2f98df76-283e-4a40-8985-e876b83119ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046033 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-encryption-config\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046048 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2rqw\" (UniqueName: \"kubernetes.io/projected/817d1626-d4a3-4df7-bbbd-0ae698936819-kube-api-access-g2rqw\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046066 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-config\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046082 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/589a627b-9da3-4db1-80b4-93c2e444bd17-serving-cert\") pod \"openshift-config-operator-7777fb866f-8gtwq\" (UID: \"589a627b-9da3-4db1-80b4-93c2e444bd17\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046102 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7d4b69-3286-49c0-8a83-74bcccf25345-serving-cert\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046120 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046137 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc 
kubenswrapper[4763]: I0930 13:37:57.046157 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046183 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9d2p\" (UniqueName: \"kubernetes.io/projected/9eaed9c6-6995-4062-8c6b-a41853220149-kube-api-access-f9d2p\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-service-ca\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046224 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046239 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817d1626-d4a3-4df7-bbbd-0ae698936819-config\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046254 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ckscg\" (UID: \"35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046269 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/589a627b-9da3-4db1-80b4-93c2e444bd17-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8gtwq\" (UID: \"589a627b-9da3-4db1-80b4-93c2e444bd17\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046287 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-etcd-client\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046303 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-oauth-serving-cert\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046320 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046336 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-audit\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046351 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76012636-5dea-475a-bc3e-bdcac5a79760-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046365 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76012636-5dea-475a-bc3e-bdcac5a79760-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046387 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee6b454d-afd6-400e-8f72-1880a5485abf-audit-dir\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046404 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046421 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046437 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: 
\"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046455 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94s87\" (UniqueName: \"kubernetes.io/projected/589a627b-9da3-4db1-80b4-93c2e444bd17-kube-api-access-94s87\") pod \"openshift-config-operator-7777fb866f-8gtwq\" (UID: \"589a627b-9da3-4db1-80b4-93c2e444bd17\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046473 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-oauth-config\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046492 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-node-pullsecrets\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046507 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zpps\" (UniqueName: \"kubernetes.io/projected/76012636-5dea-475a-bc3e-bdcac5a79760-kube-api-access-2zpps\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046524 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee6b454d-afd6-400e-8f72-1880a5485abf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046544 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046562 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4464e7-c998-48f2-bac7-bf0da585931e-serving-cert\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046576 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cd65\" (UniqueName: \"kubernetes.io/projected/5925c888-ce34-47a0-aa48-ee913adef673-kube-api-access-9cd65\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q" Sep 30 13:37:57 crc 
kubenswrapper[4763]: I0930 13:37:57.046614 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-config\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046630 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5925c888-ce34-47a0-aa48-ee913adef673-serving-cert\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046646 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-client-ca\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046662 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-dir\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046677 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-serving-cert\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046695 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f98df76-283e-4a40-8985-e876b83119ce-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wrz4k\" (UID: \"2f98df76-283e-4a40-8985-e876b83119ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046713 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5khf\" (UniqueName: \"kubernetes.io/projected/ee6b454d-afd6-400e-8f72-1880a5485abf-kube-api-access-w5khf\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046732 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-trusted-ca-bundle\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046746 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-audit-dir\") pod 
\"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046762 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4464e7-c998-48f2-bac7-bf0da585931e-config\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046797 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2f4l\" (UniqueName: \"kubernetes.io/projected/9c7d4b69-3286-49c0-8a83-74bcccf25345-kube-api-access-t2f4l\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046813 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-config\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046828 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc5d4\" (UniqueName: \"kubernetes.io/projected/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-kube-api-access-wc5d4\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.046843 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ee6b454d-afd6-400e-8f72-1880a5485abf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.049126 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-machine-approver-tls\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.049559 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee6b454d-afd6-400e-8f72-1880a5485abf-audit-dir\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.049890 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5925c888-ce34-47a0-aa48-ee913adef673-trusted-ca\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.049928 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d4464e7-c998-48f2-bac7-bf0da585931e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.050092 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.050460 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5925c888-ce34-47a0-aa48-ee913adef673-config\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.050700 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-policies\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.051034 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ee6b454d-afd6-400e-8f72-1880a5485abf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.051176 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-image-import-ca\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.051297 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ee6b454d-afd6-400e-8f72-1880a5485abf-audit-policies\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.051391 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.042581 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.051960 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-config\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.057466 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.042645 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.059219 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee6b454d-afd6-400e-8f72-1880a5485abf-serving-cert\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.061732 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.063351 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.064494 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.064731 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-client-ca\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.064796 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-dir\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.041788 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-etcd-serving-ca\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.064860 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-audit-dir\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.065440 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f98df76-283e-4a40-8985-e876b83119ce-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wrz4k\" (UID: \"2f98df76-283e-4a40-8985-e876b83119ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.066030 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.075792 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.067544 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.076123 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.064771 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-node-pullsecrets\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.076849 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.067521 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee6b454d-afd6-400e-8f72-1880a5485abf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.077167 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gdlbd"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.077473 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.077761 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-oauth-config\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.078307 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/817d1626-d4a3-4df7-bbbd-0ae698936819-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.078529 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76012636-5dea-475a-bc3e-bdcac5a79760-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.078839 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-config\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.079111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-serving-cert\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.079627 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-encryption-config\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.080697 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-trusted-ca-bundle\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.081119 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.083027 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.083257 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.083378 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.083534 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-audit\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.083654 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4464e7-c998-48f2-bac7-bf0da585931e-config\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.083918 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.085182 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-oauth-serving-cert\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.085210 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-config\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.085473 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5925c888-ce34-47a0-aa48-ee913adef673-serving-cert\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.077469 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/76012636-5dea-475a-bc3e-bdcac5a79760-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.085753 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gdlbd"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.085917 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-config\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.086021 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.086730 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817d1626-d4a3-4df7-bbbd-0ae698936819-config\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.087074 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-service-ca\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.087680 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ee6b454d-afd6-400e-8f72-1880a5485abf-encryption-config\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.087992 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.088241 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.089706 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4464e7-c998-48f2-bac7-bf0da585931e-serving-cert\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.089724 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f98df76-283e-4a40-8985-e876b83119ce-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wrz4k\" (UID: \"2f98df76-283e-4a40-8985-e876b83119ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.090965 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m47xm"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.091184 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.091647 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-m47xm"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.092401 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.092844 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.092894 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.093581 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ldk4q"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.094674 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.094717 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.096202 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.096357 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.096733 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.097255 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.097532 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.098825 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zt76t"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.099672 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.103461 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.104266 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.104515 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.106059 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-etcd-client\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.106582 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.109784 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.110216 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.111273 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.111299 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.111310 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7czqk"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.112158 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.112552 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.112856 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.112933 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.113308 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.114619 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-p5rvt"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.116152 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-serving-cert\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.121498 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7d4b69-3286-49c0-8a83-74bcccf25345-serving-cert\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.115967 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-64zqx"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.121592 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vjdsr"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.121621 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.121633 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.122852 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-55k7x"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.123572 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-55k7x"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.124588 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.125565 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ee6b454d-afd6-400e-8f72-1880a5485abf-etcd-client\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.126427 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f228d"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.127425 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.128823 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.130377 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.131455 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.131716 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.132732 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jmpjx"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.134096 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.135344 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zt76t"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.137705 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-88dg6"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.137906 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7czqk"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.139325 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-59chv"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.141031 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ldk4q"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.141112 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-59chv"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.142042 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.143392 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-55k7x"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.144901 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.146373 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.152317 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ckscg\" (UID: \"35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.152407 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ckscg\" (UID: \"35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.152440 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/589a627b-9da3-4db1-80b4-93c2e444bd17-serving-cert\") pod \"openshift-config-operator-7777fb866f-8gtwq\" (UID: \"589a627b-9da3-4db1-80b4-93c2e444bd17\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.152479 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ckscg\" (UID: \"35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.152500 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/589a627b-9da3-4db1-80b4-93c2e444bd17-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8gtwq\" (UID: \"589a627b-9da3-4db1-80b4-93c2e444bd17\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.152537 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94s87\" (UniqueName: \"kubernetes.io/projected/589a627b-9da3-4db1-80b4-93c2e444bd17-kube-api-access-94s87\") pod \"openshift-config-operator-7777fb866f-8gtwq\" (UID: \"589a627b-9da3-4db1-80b4-93c2e444bd17\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.152643 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smnng\" (UniqueName: \"kubernetes.io/projected/eb3a4cfd-1db3-488c-ba4c-bd04add6bd05-kube-api-access-smnng\") pod \"downloads-7954f5f757-f228d\" (UID: \"eb3a4cfd-1db3-488c-ba4c-bd04add6bd05\") " pod="openshift-console/downloads-7954f5f757-f228d"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.155184 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.155537 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/589a627b-9da3-4db1-80b4-93c2e444bd17-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8gtwq\" (UID: \"589a627b-9da3-4db1-80b4-93c2e444bd17\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.156162 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ckscg\" (UID: \"35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.156170 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m47xm"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.157833 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.159177 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/589a627b-9da3-4db1-80b4-93c2e444bd17-serving-cert\") pod \"openshift-config-operator-7777fb866f-8gtwq\" (UID: \"589a627b-9da3-4db1-80b4-93c2e444bd17\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.159796 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.161564 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ckscg\" (UID: \"35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.162205 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.163500 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dqjfv"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.164561 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.165949 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-59chv"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.167367 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bdwgk"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.168183 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bdwgk"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.168725 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bdwgk"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.171322 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.192719 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.206671 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.222048 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pfmnw"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.222670 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pfmnw"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.231692 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.251888 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.272566 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.312158 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.312925 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.350082 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.352940 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.372221 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.393818 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.412826 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.432519 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.451705 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.474067 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj"]
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.517706 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8fbs\" (UniqueName: \"kubernetes.io/projected/5e6e5780-d702-4c2e-9045-3e74bb98136a-kube-api-access-l8fbs\") pod \"oauth-openshift-558db77b4-vjdsr\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") " pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.534110 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcgck\" (UniqueName: \"kubernetes.io/projected/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-kube-api-access-xcgck\") pod \"console-f9d7485db-p5rvt\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " pod="openshift-console/console-f9d7485db-p5rvt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.548417 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltcsc\" (UniqueName: \"kubernetes.io/projected/2f98df76-283e-4a40-8985-e876b83119ce-kube-api-access-ltcsc\") pod \"openshift-apiserver-operator-796bbdcf4f-wrz4k\" (UID: \"2f98df76-283e-4a40-8985-e876b83119ce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.558906 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-bound-sa-token\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.558963 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db706ef4-d8fb-438b-96e5-7fef497272a0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kq9tr\" (UID: \"db706ef4-d8fb-438b-96e5-7fef497272a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.558999 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db706ef4-d8fb-438b-96e5-7fef497272a0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kq9tr\" (UID: \"db706ef4-d8fb-438b-96e5-7fef497272a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.559059 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vp8l\" (UniqueName: \"kubernetes.io/projected/0e1dec7f-15b7-44fd-8905-9084995950c2-kube-api-access-5vp8l\") pod \"multus-admission-controller-857f4d67dd-88dg6\" (UID: \"0e1dec7f-15b7-44fd-8905-9084995950c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.559163 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spbgf\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-kube-api-access-spbgf\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.559200 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.559239 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-tls\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.559517 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.559818 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.559940 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97399be7-e1a2-4803-b32d-3c2490e98204-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nkszw\" (UID: \"97399be7-e1a2-4803-b32d-3c2490e98204\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw"
Sep 30 13:37:57 crc kubenswrapper[4763]: E0930 13:37:57.560228 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.060208977 +0000 UTC m=+150.198769282 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.560645 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e1dec7f-15b7-44fd-8905-9084995950c2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-88dg6\" (UID: \"0e1dec7f-15b7-44fd-8905-9084995950c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.560774 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97399be7-e1a2-4803-b32d-3c2490e98204-config\") pod \"kube-controller-manager-operator-78b949d7b-nkszw\" (UID: \"97399be7-e1a2-4803-b32d-3c2490e98204\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.560958 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db706ef4-d8fb-438b-96e5-7fef497272a0-config\") pod \"kube-apiserver-operator-766d6c64bb-kq9tr\" (UID: \"db706ef4-d8fb-438b-96e5-7fef497272a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.561113 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97399be7-e1a2-4803-b32d-3c2490e98204-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nkszw\" (UID: \"97399be7-e1a2-4803-b32d-3c2490e98204\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.561199 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-trusted-ca\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.561300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-certificates\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.569103 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcq6c\" (UniqueName: \"kubernetes.io/projected/f2c347bc-ec1b-4ead-b9c8-f8a3443c2322-kube-api-access-lcq6c\") pod \"apiserver-76f77b778f-6ts49\" (UID: \"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322\") " pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.586327 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fhg6\" (UniqueName: \"kubernetes.io/projected/9d4464e7-c998-48f2-bac7-bf0da585931e-kube-api-access-9fhg6\") pod \"authentication-operator-69f744f599-64zqx\" (UID: \"9d4464e7-c998-48f2-bac7-bf0da585931e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.606116 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cd65\" (UniqueName: \"kubernetes.io/projected/5925c888-ce34-47a0-aa48-ee913adef673-kube-api-access-9cd65\") pod \"console-operator-58897d9998-f7g5q\" (UID: \"5925c888-ce34-47a0-aa48-ee913adef673\") " pod="openshift-console-operator/console-operator-58897d9998-f7g5q"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.627009 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76012636-5dea-475a-bc3e-bdcac5a79760-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.635121 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.649498 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9d2p\" (UniqueName: \"kubernetes.io/projected/9eaed9c6-6995-4062-8c6b-a41853220149-kube-api-access-f9d2p\") pod \"controller-manager-879f6c89f-9rcjp\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.661912 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662049 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2nsb\" (UniqueName: \"kubernetes.io/projected/8920a637-88bb-4ecd-b699-7f66dd955746-kube-api-access-c2nsb\") pod \"machine-config-controller-84d6567774-ndxvt\" (UID: \"8920a637-88bb-4ecd-b699-7f66dd955746\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662070 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b20ac013-c0be-4b7a-b5a8-cd6db89814ee-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms8hj\" (UID: \"b20ac013-c0be-4b7a-b5a8-cd6db89814ee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662089 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/66560dbc-ad52-44b6-ad49-a8c83e403714-signing-cabundle\") pod \"service-ca-9c57cc56f-ldk4q\" (UID: \"66560dbc-ad52-44b6-ad49-a8c83e403714\") " pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662109 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a540445-8589-4437-b134-38ba9d38faf0-secret-volume\") pod \"collect-profiles-29320650-kpklw\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662124 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/318791e7-bdca-4695-9910-b9162ff85baf-metrics-tls\") pod \"dns-operator-744455d44c-m47xm\" (UID: \"318791e7-bdca-4695-9910-b9162ff85baf\") " pod="openshift-dns-operator/dns-operator-744455d44c-m47xm"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662150 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srz4b\" (UniqueName: \"kubernetes.io/projected/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-kube-api-access-srz4b\") pod \"marketplace-operator-79b997595-dqjfv\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662200 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqvbq\" (UniqueName: \"kubernetes.io/projected/d11299ad-6339-4844-8031-d517a5535b1b-kube-api-access-mqvbq\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662226 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a540445-8589-4437-b134-38ba9d38faf0-config-volume\") pod \"collect-profiles-29320650-kpklw\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjp4x\" (UniqueName: \"kubernetes.io/projected/dfb46cdc-6bb1-48a9-a80f-3330b65f96cf-kube-api-access-fjp4x\") pod \"olm-operator-6b444d44fb-g4x6h\" (UID: \"dfb46cdc-6bb1-48a9-a80f-3330b65f96cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662269 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ltnp\" (UniqueName: \"kubernetes.io/projected/ee65800a-94a4-43c1-a5bf-3a7b889619d8-kube-api-access-6ltnp\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662285 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zq6k\" (UniqueName: \"kubernetes.io/projected/e350689d-81e5-4fc9-a346-b57a553f39fd-kube-api-access-2zq6k\") pod \"ingress-canary-55k7x\" (UID: \"e350689d-81e5-4fc9-a346-b57a553f39fd\") " pod="openshift-ingress-canary/ingress-canary-55k7x"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662302 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n84z7\" (UniqueName: \"kubernetes.io/projected/63d0c459-8206-47bc-991f-2c3a1ed20a4f-kube-api-access-n84z7\") pod \"openshift-controller-manager-operator-756b6f6bc6-7jfqq\" (UID: \"63d0c459-8206-47bc-991f-2c3a1ed20a4f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662317 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d0c459-8206-47bc-991f-2c3a1ed20a4f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7jfqq\" (UID: \"63d0c459-8206-47bc-991f-2c3a1ed20a4f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662331 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a660fb91-7237-4d7a-9c8b-643f63bd589f-config\") pod \"service-ca-operator-777779d784-zt76t\" (UID: \"a660fb91-7237-4d7a-9c8b-643f63bd589f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662346 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-metrics-certs\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662364 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fldhw\" (UniqueName: \"kubernetes.io/projected/1f7a55a5-5e61-465e-927b-08cc7e8884a2-kube-api-access-fldhw\") pod \"kube-storage-version-migrator-operator-b67b599dd-2qnwb\" (UID: \"1f7a55a5-5e61-465e-927b-08cc7e8884a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662383 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-certs\") pod \"machine-config-server-pfmnw\" (UID: \"0c3003e9-34dc-492f-8cd2-ca6851841117\") " pod="openshift-machine-config-operator/machine-config-server-pfmnw"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662410 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-registration-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv"
Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662441 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\"
(UniqueName: \"kubernetes.io/secret/ee65800a-94a4-43c1-a5bf-3a7b889619d8-metrics-tls\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662473 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-webhook-cert\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662511 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db706ef4-d8fb-438b-96e5-7fef497272a0-config\") pod \"kube-apiserver-operator-766d6c64bb-kq9tr\" (UID: \"db706ef4-d8fb-438b-96e5-7fef497272a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662527 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97399be7-e1a2-4803-b32d-3c2490e98204-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nkszw\" (UID: \"97399be7-e1a2-4803-b32d-3c2490e98204\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662544 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-certificates\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662561 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-trusted-ca\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662578 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8920a637-88bb-4ecd-b699-7f66dd955746-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ndxvt\" (UID: \"8920a637-88bb-4ecd-b699-7f66dd955746\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662593 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-socket-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662643 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-node-bootstrap-token\") pod 
\"machine-config-server-pfmnw\" (UID: \"0c3003e9-34dc-492f-8cd2-ca6851841117\") " pod="openshift-machine-config-operator/machine-config-server-pfmnw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/66560dbc-ad52-44b6-ad49-a8c83e403714-signing-key\") pod \"service-ca-9c57cc56f-ldk4q\" (UID: \"66560dbc-ad52-44b6-ad49-a8c83e403714\") " pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662722 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-apiservice-cert\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662740 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db706ef4-d8fb-438b-96e5-7fef497272a0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kq9tr\" (UID: \"db706ef4-d8fb-438b-96e5-7fef497272a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662755 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-stats-auth\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662772 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e350689d-81e5-4fc9-a346-b57a553f39fd-cert\") pod \"ingress-canary-55k7x\" (UID: \"e350689d-81e5-4fc9-a346-b57a553f39fd\") " pod="openshift-ingress-canary/ingress-canary-55k7x" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662800 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dfb46cdc-6bb1-48a9-a80f-3330b65f96cf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g4x6h\" (UID: \"dfb46cdc-6bb1-48a9-a80f-3330b65f96cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662835 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt68q\" (UniqueName: \"kubernetes.io/projected/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-kube-api-access-wt68q\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662859 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlxsj\" (UniqueName: \"kubernetes.io/projected/318791e7-bdca-4695-9910-b9162ff85baf-kube-api-access-wlxsj\") pod \"dns-operator-744455d44c-m47xm\" (UID: \"318791e7-bdca-4695-9910-b9162ff85baf\") " pod="openshift-dns-operator/dns-operator-744455d44c-m47xm" Sep 30 
13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662883 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dqjfv\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662915 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a660fb91-7237-4d7a-9c8b-643f63bd589f-serving-cert\") pod \"service-ca-operator-777779d784-zt76t\" (UID: \"a660fb91-7237-4d7a-9c8b-643f63bd589f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662969 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-default-certificate\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.662999 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-tls\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663014 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-config\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663030 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-serving-cert\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663045 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dqjfv\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663059 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-tmpfs\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663092 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8920a637-88bb-4ecd-b699-7f66dd955746-proxy-tls\") pod \"machine-config-controller-84d6567774-ndxvt\" (UID: \"8920a637-88bb-4ecd-b699-7f66dd955746\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663122 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pvq8\" (UniqueName: \"kubernetes.io/projected/a660fb91-7237-4d7a-9c8b-643f63bd589f-kube-api-access-6pvq8\") pod \"service-ca-operator-777779d784-zt76t\" (UID: \"a660fb91-7237-4d7a-9c8b-643f63bd589f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663191 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-mountpoint-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663220 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee65800a-94a4-43c1-a5bf-3a7b889619d8-trusted-ca\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663238 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65jt9\" (UniqueName: \"kubernetes.io/projected/0c3003e9-34dc-492f-8cd2-ca6851841117-kube-api-access-65jt9\") pod \"machine-config-server-pfmnw\" (UID: \"0c3003e9-34dc-492f-8cd2-ca6851841117\") " pod="openshift-machine-config-operator/machine-config-server-pfmnw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663252 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzbks\" (UniqueName: \"kubernetes.io/projected/a2444e42-08b2-4e35-ae89-e2666a8fd3b6-kube-api-access-hzbks\") pod \"package-server-manager-789f6589d5-ddx8p\" (UID: \"a2444e42-08b2-4e35-ae89-e2666a8fd3b6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663266 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhwtj\" (UniqueName: \"kubernetes.io/projected/b20ac013-c0be-4b7a-b5a8-cd6db89814ee-kube-api-access-nhwtj\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms8hj\" (UID: \"b20ac013-c0be-4b7a-b5a8-cd6db89814ee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663292 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97399be7-e1a2-4803-b32d-3c2490e98204-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nkszw\" (UID: \"97399be7-e1a2-4803-b32d-3c2490e98204\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 
13:37:57.663349 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-plugins-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663374 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/85d6ec09-077d-4958-b8c7-d09dd9c45e29-images\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663396 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvl9\" (UniqueName: \"kubernetes.io/projected/66560dbc-ad52-44b6-ad49-a8c83e403714-kube-api-access-4rvl9\") pod \"service-ca-9c57cc56f-ldk4q\" (UID: \"66560dbc-ad52-44b6-ad49-a8c83e403714\") " pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663423 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7a55a5-5e61-465e-927b-08cc7e8884a2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2qnwb\" (UID: \"1f7a55a5-5e61-465e-927b-08cc7e8884a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663443 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-service-ca\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663462 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-etcd-client\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663492 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e1dec7f-15b7-44fd-8905-9084995950c2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-88dg6\" (UID: \"0e1dec7f-15b7-44fd-8905-9084995950c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663509 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb5fc3ab-ae64-4fea-ba4f-010a25c6791c-srv-cert\") pod \"catalog-operator-68c6474976-xbmq9\" (UID: \"eb5fc3ab-ae64-4fea-ba4f-010a25c6791c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663525 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-v4jgt\" (UniqueName: \"kubernetes.io/projected/eb5fc3ab-ae64-4fea-ba4f-010a25c6791c-kube-api-access-v4jgt\") pod \"catalog-operator-68c6474976-xbmq9\" (UID: \"eb5fc3ab-ae64-4fea-ba4f-010a25c6791c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663540 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-service-ca-bundle\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97399be7-e1a2-4803-b32d-3c2490e98204-config\") pod \"kube-controller-manager-operator-78b949d7b-nkszw\" (UID: \"97399be7-e1a2-4803-b32d-3c2490e98204\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663572 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85d6ec09-077d-4958-b8c7-d09dd9c45e29-proxy-tls\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663620 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb5fc3ab-ae64-4fea-ba4f-010a25c6791c-profile-collector-cert\") pod \"catalog-operator-68c6474976-xbmq9\" (UID: \"eb5fc3ab-ae64-4fea-ba4f-010a25c6791c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663643 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dfb46cdc-6bb1-48a9-a80f-3330b65f96cf-srv-cert\") pod \"olm-operator-6b444d44fb-g4x6h\" (UID: \"dfb46cdc-6bb1-48a9-a80f-3330b65f96cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663674 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tplb6\" (UniqueName: \"kubernetes.io/projected/6a540445-8589-4437-b134-38ba9d38faf0-kube-api-access-tplb6\") pod \"collect-profiles-29320650-kpklw\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663689 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-ca\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663707 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85d6ec09-077d-4958-b8c7-d09dd9c45e29-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663722 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-bound-sa-token\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663737 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db706ef4-d8fb-438b-96e5-7fef497272a0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kq9tr\" (UID: \"db706ef4-d8fb-438b-96e5-7fef497272a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663754 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnmwm\" (UniqueName: \"kubernetes.io/projected/97702a69-e0ad-47b9-b8b7-d32fadc9185e-kube-api-access-jnmwm\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663768 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-csi-data-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663784 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2444e42-08b2-4e35-ae89-e2666a8fd3b6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ddx8p\" (UID: \"a2444e42-08b2-4e35-ae89-e2666a8fd3b6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663800 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-747qm\" (UniqueName: \"kubernetes.io/projected/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-kube-api-access-747qm\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663825 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vp8l\" (UniqueName: \"kubernetes.io/projected/0e1dec7f-15b7-44fd-8905-9084995950c2-kube-api-access-5vp8l\") pod \"multus-admission-controller-857f4d67dd-88dg6\" (UID: \"0e1dec7f-15b7-44fd-8905-9084995950c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663840 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/ee65800a-94a4-43c1-a5bf-3a7b889619d8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663856 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qrh4\" (UniqueName: \"kubernetes.io/projected/7c634595-a271-4ae0-8477-39ef345aa87b-kube-api-access-4qrh4\") pod \"migrator-59844c95c7-w2tnd\" (UID: \"7c634595-a271-4ae0-8477-39ef345aa87b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbgf\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-kube-api-access-spbgf\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663888 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663904 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw5ff\" (UniqueName: \"kubernetes.io/projected/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-kube-api-access-fw5ff\") pod \"dns-default-bdwgk\" (UID: \"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9\") " pod="openshift-dns/dns-default-bdwgk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663919 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b59zj\" (UniqueName: \"kubernetes.io/projected/85d6ec09-077d-4958-b8c7-d09dd9c45e29-kube-api-access-b59zj\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663933 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7a55a5-5e61-465e-927b-08cc7e8884a2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2qnwb\" (UID: \"1f7a55a5-5e61-465e-927b-08cc7e8884a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663953 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63d0c459-8206-47bc-991f-2c3a1ed20a4f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7jfqq\" (UID: \"63d0c459-8206-47bc-991f-2c3a1ed20a4f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663968 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-config-volume\") pod \"dns-default-bdwgk\" (UID: \"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9\") " pod="openshift-dns/dns-default-bdwgk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663982 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-metrics-tls\") pod \"dns-default-bdwgk\" (UID: \"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9\") " pod="openshift-dns/dns-default-bdwgk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.663998 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: E0930 13:37:57.664344 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.164326445 +0000 UTC m=+150.302886740 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.666008 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db706ef4-d8fb-438b-96e5-7fef497272a0-config\") pod \"kube-apiserver-operator-766d6c64bb-kq9tr\" (UID: \"db706ef4-d8fb-438b-96e5-7fef497272a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.666972 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-certificates\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.667908 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-trusted-ca\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.670573 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: 
I0930 13:37:57.672070 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db706ef4-d8fb-438b-96e5-7fef497272a0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kq9tr\" (UID: \"db706ef4-d8fb-438b-96e5-7fef497272a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.672152 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97399be7-e1a2-4803-b32d-3c2490e98204-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nkszw\" (UID: \"97399be7-e1a2-4803-b32d-3c2490e98204\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.672285 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97399be7-e1a2-4803-b32d-3c2490e98204-config\") pod \"kube-controller-manager-operator-78b949d7b-nkszw\" (UID: \"97399be7-e1a2-4803-b32d-3c2490e98204\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.672471 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e1dec7f-15b7-44fd-8905-9084995950c2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-88dg6\" (UID: \"0e1dec7f-15b7-44fd-8905-9084995950c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.672836 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-tls\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.680745 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2rqw\" (UniqueName: \"kubernetes.io/projected/817d1626-d4a3-4df7-bbbd-0ae698936819-kube-api-access-g2rqw\") pod \"machine-api-operator-5694c8668f-jqx2k\" (UID: \"817d1626-d4a3-4df7-bbbd-0ae698936819\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.683163 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.688573 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc5d4\" (UniqueName: \"kubernetes.io/projected/4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b-kube-api-access-wc5d4\") pod \"machine-approver-56656f9798-x2jhk\" (UID: \"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.689929 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.692101 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.696010 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.725900 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.727000 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-f7g5q" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.728647 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.733677 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.742109 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.753126 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.761412 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85d6ec09-077d-4958-b8c7-d09dd9c45e29-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765274 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnmwm\" (UniqueName: \"kubernetes.io/projected/97702a69-e0ad-47b9-b8b7-d32fadc9185e-kube-api-access-jnmwm\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765309 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-csi-data-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765475 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2444e42-08b2-4e35-ae89-e2666a8fd3b6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ddx8p\" (UID: \"a2444e42-08b2-4e35-ae89-e2666a8fd3b6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765487 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-csi-data-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765682 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-747qm\" (UniqueName: \"kubernetes.io/projected/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-kube-api-access-747qm\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765766 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee65800a-94a4-43c1-a5bf-3a7b889619d8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765795 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qrh4\" (UniqueName: \"kubernetes.io/projected/7c634595-a271-4ae0-8477-39ef345aa87b-kube-api-access-4qrh4\") pod \"migrator-59844c95c7-w2tnd\" (UID: \"7c634595-a271-4ae0-8477-39ef345aa87b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765831 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fw5ff\" (UniqueName: \"kubernetes.io/projected/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-kube-api-access-fw5ff\") pod \"dns-default-bdwgk\" (UID: \"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9\") " pod="openshift-dns/dns-default-bdwgk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765855 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b59zj\" (UniqueName: \"kubernetes.io/projected/85d6ec09-077d-4958-b8c7-d09dd9c45e29-kube-api-access-b59zj\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765886 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7a55a5-5e61-465e-927b-08cc7e8884a2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2qnwb\" (UID: \"1f7a55a5-5e61-465e-927b-08cc7e8884a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765927 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63d0c459-8206-47bc-991f-2c3a1ed20a4f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7jfqq\" (UID: \"63d0c459-8206-47bc-991f-2c3a1ed20a4f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765955 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-config-volume\") pod \"dns-default-bdwgk\" (UID: \"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9\") " pod="openshift-dns/dns-default-bdwgk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.765976 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-metrics-tls\") pod \"dns-default-bdwgk\" (UID: \"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9\") " pod="openshift-dns/dns-default-bdwgk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766007 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2nsb\" (UniqueName: \"kubernetes.io/projected/8920a637-88bb-4ecd-b699-7f66dd955746-kube-api-access-c2nsb\") pod \"machine-config-controller-84d6567774-ndxvt\" (UID: \"8920a637-88bb-4ecd-b699-7f66dd955746\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766036 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b20ac013-c0be-4b7a-b5a8-cd6db89814ee-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms8hj\" (UID: \"b20ac013-c0be-4b7a-b5a8-cd6db89814ee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766063 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/66560dbc-ad52-44b6-ad49-a8c83e403714-signing-cabundle\") pod \"service-ca-9c57cc56f-ldk4q\" (UID: \"66560dbc-ad52-44b6-ad49-a8c83e403714\") " pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766090 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a540445-8589-4437-b134-38ba9d38faf0-secret-volume\") pod \"collect-profiles-29320650-kpklw\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766114 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/318791e7-bdca-4695-9910-b9162ff85baf-metrics-tls\") pod \"dns-operator-744455d44c-m47xm\" (UID: \"318791e7-bdca-4695-9910-b9162ff85baf\") " pod="openshift-dns-operator/dns-operator-744455d44c-m47xm" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srz4b\" (UniqueName: \"kubernetes.io/projected/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-kube-api-access-srz4b\") pod \"marketplace-operator-79b997595-dqjfv\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766178 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqvbq\" (UniqueName: \"kubernetes.io/projected/d11299ad-6339-4844-8031-d517a5535b1b-kube-api-access-mqvbq\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a540445-8589-4437-b134-38ba9d38faf0-config-volume\") pod \"collect-profiles-29320650-kpklw\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766275 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjp4x\" (UniqueName: \"kubernetes.io/projected/dfb46cdc-6bb1-48a9-a80f-3330b65f96cf-kube-api-access-fjp4x\") pod \"olm-operator-6b444d44fb-g4x6h\" (UID: \"dfb46cdc-6bb1-48a9-a80f-3330b65f96cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766309 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ltnp\" (UniqueName: \"kubernetes.io/projected/ee65800a-94a4-43c1-a5bf-3a7b889619d8-kube-api-access-6ltnp\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766331 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zq6k\" (UniqueName: \"kubernetes.io/projected/e350689d-81e5-4fc9-a346-b57a553f39fd-kube-api-access-2zq6k\") pod \"ingress-canary-55k7x\" (UID: \"e350689d-81e5-4fc9-a346-b57a553f39fd\") " 
pod="openshift-ingress-canary/ingress-canary-55k7x" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766369 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n84z7\" (UniqueName: \"kubernetes.io/projected/63d0c459-8206-47bc-991f-2c3a1ed20a4f-kube-api-access-n84z7\") pod \"openshift-controller-manager-operator-756b6f6bc6-7jfqq\" (UID: \"63d0c459-8206-47bc-991f-2c3a1ed20a4f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766400 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d0c459-8206-47bc-991f-2c3a1ed20a4f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7jfqq\" (UID: \"63d0c459-8206-47bc-991f-2c3a1ed20a4f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766422 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a660fb91-7237-4d7a-9c8b-643f63bd589f-config\") pod \"service-ca-operator-777779d784-zt76t\" (UID: \"a660fb91-7237-4d7a-9c8b-643f63bd589f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766456 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-metrics-certs\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766485 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fldhw\" (UniqueName: \"kubernetes.io/projected/1f7a55a5-5e61-465e-927b-08cc7e8884a2-kube-api-access-fldhw\") pod \"kube-storage-version-migrator-operator-b67b599dd-2qnwb\" (UID: \"1f7a55a5-5e61-465e-927b-08cc7e8884a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766508 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-certs\") pod \"machine-config-server-pfmnw\" (UID: \"0c3003e9-34dc-492f-8cd2-ca6851841117\") " pod="openshift-machine-config-operator/machine-config-server-pfmnw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766551 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-registration-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766621 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee65800a-94a4-43c1-a5bf-3a7b889619d8-metrics-tls\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766650 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-webhook-cert\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766695 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8920a637-88bb-4ecd-b699-7f66dd955746-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ndxvt\" (UID: \"8920a637-88bb-4ecd-b699-7f66dd955746\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766732 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-socket-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766755 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-node-bootstrap-token\") pod \"machine-config-server-pfmnw\" (UID: \"0c3003e9-34dc-492f-8cd2-ca6851841117\") " pod="openshift-machine-config-operator/machine-config-server-pfmnw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766778 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/66560dbc-ad52-44b6-ad49-a8c83e403714-signing-key\") pod \"service-ca-9c57cc56f-ldk4q\" (UID: \"66560dbc-ad52-44b6-ad49-a8c83e403714\") " pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766804 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-apiservice-cert\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766831 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-stats-auth\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766854 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e350689d-81e5-4fc9-a346-b57a553f39fd-cert\") pod \"ingress-canary-55k7x\" (UID: \"e350689d-81e5-4fc9-a346-b57a553f39fd\") " pod="openshift-ingress-canary/ingress-canary-55k7x" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766892 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dfb46cdc-6bb1-48a9-a80f-3330b65f96cf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g4x6h\" (UID: \"dfb46cdc-6bb1-48a9-a80f-3330b65f96cf\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766931 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt68q\" (UniqueName: \"kubernetes.io/projected/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-kube-api-access-wt68q\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766973 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlxsj\" (UniqueName: \"kubernetes.io/projected/318791e7-bdca-4695-9910-b9162ff85baf-kube-api-access-wlxsj\") pod \"dns-operator-744455d44c-m47xm\" (UID: \"318791e7-bdca-4695-9910-b9162ff85baf\") " pod="openshift-dns-operator/dns-operator-744455d44c-m47xm" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.766976 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-registration-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767000 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dqjfv\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767022 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a660fb91-7237-4d7a-9c8b-643f63bd589f-serving-cert\") pod \"service-ca-operator-777779d784-zt76t\" (UID: \"a660fb91-7237-4d7a-9c8b-643f63bd589f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767048 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-default-certificate\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767078 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-config\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767096 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-serving-cert\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767119 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dqjfv\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-tmpfs\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767173 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8920a637-88bb-4ecd-b699-7f66dd955746-proxy-tls\") pod \"machine-config-controller-84d6567774-ndxvt\" (UID: \"8920a637-88bb-4ecd-b699-7f66dd955746\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767206 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pvq8\" (UniqueName: \"kubernetes.io/projected/a660fb91-7237-4d7a-9c8b-643f63bd589f-kube-api-access-6pvq8\") pod \"service-ca-operator-777779d784-zt76t\" (UID: \"a660fb91-7237-4d7a-9c8b-643f63bd589f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767235 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767241 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85d6ec09-077d-4958-b8c7-d09dd9c45e29-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767270 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-mountpoint-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767290 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee65800a-94a4-43c1-a5bf-3a7b889619d8-trusted-ca\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767313 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65jt9\" (UniqueName: \"kubernetes.io/projected/0c3003e9-34dc-492f-8cd2-ca6851841117-kube-api-access-65jt9\") pod \"machine-config-server-pfmnw\" (UID: 
\"0c3003e9-34dc-492f-8cd2-ca6851841117\") " pod="openshift-machine-config-operator/machine-config-server-pfmnw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767367 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzbks\" (UniqueName: \"kubernetes.io/projected/a2444e42-08b2-4e35-ae89-e2666a8fd3b6-kube-api-access-hzbks\") pod \"package-server-manager-789f6589d5-ddx8p\" (UID: \"a2444e42-08b2-4e35-ae89-e2666a8fd3b6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767394 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhwtj\" (UniqueName: \"kubernetes.io/projected/b20ac013-c0be-4b7a-b5a8-cd6db89814ee-kube-api-access-nhwtj\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms8hj\" (UID: \"b20ac013-c0be-4b7a-b5a8-cd6db89814ee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767418 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-plugins-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767471 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/85d6ec09-077d-4958-b8c7-d09dd9c45e29-images\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767498 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvl9\" (UniqueName: \"kubernetes.io/projected/66560dbc-ad52-44b6-ad49-a8c83e403714-kube-api-access-4rvl9\") pod \"service-ca-9c57cc56f-ldk4q\" (UID: \"66560dbc-ad52-44b6-ad49-a8c83e403714\") " pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767528 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7a55a5-5e61-465e-927b-08cc7e8884a2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2qnwb\" (UID: \"1f7a55a5-5e61-465e-927b-08cc7e8884a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767553 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-service-ca\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767575 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-etcd-client\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:57 crc kubenswrapper[4763]: 
I0930 13:37:57.767617 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb5fc3ab-ae64-4fea-ba4f-010a25c6791c-srv-cert\") pod \"catalog-operator-68c6474976-xbmq9\" (UID: \"eb5fc3ab-ae64-4fea-ba4f-010a25c6791c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767647 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4jgt\" (UniqueName: \"kubernetes.io/projected/eb5fc3ab-ae64-4fea-ba4f-010a25c6791c-kube-api-access-v4jgt\") pod \"catalog-operator-68c6474976-xbmq9\" (UID: \"eb5fc3ab-ae64-4fea-ba4f-010a25c6791c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-service-ca-bundle\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85d6ec09-077d-4958-b8c7-d09dd9c45e29-proxy-tls\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767715 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb5fc3ab-ae64-4fea-ba4f-010a25c6791c-profile-collector-cert\") pod \"catalog-operator-68c6474976-xbmq9\" (UID: \"eb5fc3ab-ae64-4fea-ba4f-010a25c6791c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767738 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dfb46cdc-6bb1-48a9-a80f-3330b65f96cf-srv-cert\") pod \"olm-operator-6b444d44fb-g4x6h\" (UID: \"dfb46cdc-6bb1-48a9-a80f-3330b65f96cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767772 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tplb6\" (UniqueName: \"kubernetes.io/projected/6a540445-8589-4437-b134-38ba9d38faf0-kube-api-access-tplb6\") pod \"collect-profiles-29320650-kpklw\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767793 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-ca\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.767855 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d0c459-8206-47bc-991f-2c3a1ed20a4f-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-7jfqq\" (UID: \"63d0c459-8206-47bc-991f-2c3a1ed20a4f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.768180 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-plugins-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.768197 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8920a637-88bb-4ecd-b699-7f66dd955746-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ndxvt\" (UID: \"8920a637-88bb-4ecd-b699-7f66dd955746\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.768276 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-socket-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: E0930 13:37:57.768760 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.268745329 +0000 UTC m=+150.407305774 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.769038 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-tmpfs\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.769248 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/97702a69-e0ad-47b9-b8b7-d32fadc9185e-mountpoint-dir\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.770682 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dqjfv\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.772556 4763 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63d0c459-8206-47bc-991f-2c3a1ed20a4f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7jfqq\" (UID: \"63d0c459-8206-47bc-991f-2c3a1ed20a4f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.777852 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.777881 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b20ac013-c0be-4b7a-b5a8-cd6db89814ee-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms8hj\" (UID: \"b20ac013-c0be-4b7a-b5a8-cd6db89814ee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.784625 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dfb46cdc-6bb1-48a9-a80f-3330b65f96cf-srv-cert\") pod \"olm-operator-6b444d44fb-g4x6h\" (UID: \"dfb46cdc-6bb1-48a9-a80f-3330b65f96cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.785036 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dqjfv\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.795084 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.806188 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb5fc3ab-ae64-4fea-ba4f-010a25c6791c-srv-cert\") pod \"catalog-operator-68c6474976-xbmq9\" (UID: \"eb5fc3ab-ae64-4fea-ba4f-010a25c6791c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.818273 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.823320 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k"] Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.824103 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.825830 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb5fc3ab-ae64-4fea-ba4f-010a25c6791c-profile-collector-cert\") pod \"catalog-operator-68c6474976-xbmq9\" (UID: \"eb5fc3ab-ae64-4fea-ba4f-010a25c6791c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.830824 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a540445-8589-4437-b134-38ba9d38faf0-secret-volume\") pod \"collect-profiles-29320650-kpklw\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.831382 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dfb46cdc-6bb1-48a9-a80f-3330b65f96cf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g4x6h\" (UID: \"dfb46cdc-6bb1-48a9-a80f-3330b65f96cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.832631 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.837171 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.853708 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.859181 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/85d6ec09-077d-4958-b8c7-d09dd9c45e29-images\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.868882 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:37:57 crc kubenswrapper[4763]: E0930 13:37:57.869718 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.369685083 +0000 UTC m=+150.508245368 (durationBeforeRetry 500ms). 
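
The "No retries permitted until … (durationBeforeRetry 500ms)" lines here and below come from the kubelet's volume manager, which serializes operations per volume (nestedpendingoperations) and backs off exponentially after each failure. A minimal sketch of the doubling rule these messages suggest; the 500ms starting point matches the log, while the cap constant is an assumption for illustration, not a value read from this system:

    package main

    import (
    	"fmt"
    	"time"
    )

    // Illustrative constants: the log shows a 500ms first delay; the cap
    // below is an assumption, standing in for the kubelet's real maximum.
    const (
    	initialDelay = 500 * time.Millisecond
    	maxDelay     = 2*time.Minute + 2*time.Second
    )

    // nextDelay doubles the previous delay after each failure, clamped to maxDelay.
    func nextDelay(prev time.Duration) time.Duration {
    	if prev == 0 {
    		return initialDelay
    	}
    	d := prev * 2
    	if d > maxDelay {
    		d = maxDelay
    	}
    	return d
    }

    func main() {
    	var d time.Duration
    	for i := 0; i < 10; i++ {
    		d = nextDelay(d)
    		fmt.Printf("attempt %d: wait %v before retry\n", i+1, d)
    	}
    }
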
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.875309 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.896883 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.912556 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.922948 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vjdsr"] Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.926317 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85d6ec09-077d-4958-b8c7-d09dd9c45e29-proxy-tls\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.958156 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zpps\" (UniqueName: \"kubernetes.io/projected/76012636-5dea-475a-bc3e-bdcac5a79760-kube-api-access-2zpps\") pod \"cluster-image-registry-operator-dc59b4c8b-lvdnm\" (UID: \"76012636-5dea-475a-bc3e-bdcac5a79760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.969560 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2f4l\" (UniqueName: \"kubernetes.io/projected/9c7d4b69-3286-49c0-8a83-74bcccf25345-kube-api-access-t2f4l\") pod \"route-controller-manager-6576b87f9c-t2qjv\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.972835 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:57 crc kubenswrapper[4763]: E0930 13:37:57.973361 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.473346569 +0000 UTC m=+150.611906854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:57 crc kubenswrapper[4763]: I0930 13:37:57.998313 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-p5rvt"] Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.002656 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.011734 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.014050 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.019257 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee65800a-94a4-43c1-a5bf-3a7b889619d8-trusted-ca\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.034209 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9rcjp"] Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.034738 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.035072 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.039767 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee65800a-94a4-43c1-a5bf-3a7b889619d8-metrics-tls\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.053622 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.063233 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-default-certificate\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.074869 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.075180 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.075544 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.075637 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.575590333 +0000 UTC m=+150.714150618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.075885 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.076481 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.576466244 +0000 UTC m=+150.715026589 (durationBeforeRetry 500ms). 
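
All of these mount and unmount failures share one root cause: no CSI driver named kubevirt.io.hostpath-provisioner is registered with this kubelet yet. Drivers register by placing a gRPC socket in the kubelet's plugin-registration directory, which is exactly the registration-dir host path mounted into csi-hostpathplugin-59chv earlier in this log (typically via a node-driver-registrar sidecar); once that pod is up, these retries succeed. A quick way to see what is currently registered, assuming the conventional /var/lib/kubelet/plugins_registry location:

    package main

    import (
    	"fmt"
    	"os"
    )

    // Lists the plugin-registration sockets the kubelet watches.
    // Assumes the default kubelet path /var/lib/kubelet/plugins_registry;
    // an empty listing matches the "not found in the list of registered
    // CSI drivers" errors in this log.
    func main() {
    	const dir = "/var/lib/kubelet/plugins_registry"
    	entries, err := os.ReadDir(dir)
    	if err != nil {
    		fmt.Fprintln(os.Stderr, "read:", err)
    		os.Exit(1)
    	}
    	for _, e := range entries {
    		fmt.Println(e.Name()) // e.g. a *-reg.sock for each registered driver
    	}
    }
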
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.090062 4763 request.go:700] Waited for 1.004106373s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.095041 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 13:37:58 crc kubenswrapper[4763]: W0930 13:37:58.100022 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eaed9c6_6995_4062_8c6b_a41853220149.slice/crio-ad95d5e681e349a692a53b0315c14fd2214619cc565927e1b70d9382d6eb6438 WatchSource:0}: Error finding container ad95d5e681e349a692a53b0315c14fd2214619cc565927e1b70d9382d6eb6438: Status 404 returned error can't find the container with id ad95d5e681e349a692a53b0315c14fd2214619cc565927e1b70d9382d6eb6438 Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.116053 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.123317 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-metrics-certs\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.133330 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.140498 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-service-ca-bundle\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.153008 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.174258 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.177229 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.177318 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.677300245 +0000 UTC m=+150.815860530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.177470 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.177945 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.677928609 +0000 UTC m=+150.816488974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.183572 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-stats-auth\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.211778 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.217231 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5khf\" (UniqueName: \"kubernetes.io/projected/ee6b454d-afd6-400e-8f72-1880a5485abf-kube-api-access-w5khf\") pod \"apiserver-7bbb656c7d-lnk86\" (UID: \"ee6b454d-afd6-400e-8f72-1880a5485abf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.223082 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8920a637-88bb-4ecd-b699-7f66dd955746-proxy-tls\") pod \"machine-config-controller-84d6567774-ndxvt\" (UID: \"8920a637-88bb-4ecd-b699-7f66dd955746\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.233593 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.251326 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.273112 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.278930 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.778908675 +0000 UTC m=+150.917468960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.279761 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.280200 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.280729 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.780719566 +0000 UTC m=+150.919279851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.293124 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.302491 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/318791e7-bdca-4695-9910-b9162ff85baf-metrics-tls\") pod \"dns-operator-744455d44c-m47xm\" (UID: \"318791e7-bdca-4695-9910-b9162ff85baf\") " pod="openshift-dns-operator/dns-operator-744455d44c-m47xm" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.304069 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6ts49"] Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.313756 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm"] Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.313848 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.316507 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f7g5q"] Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.333081 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.340926 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2444e42-08b2-4e35-ae89-e2666a8fd3b6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ddx8p\" (UID: \"a2444e42-08b2-4e35-ae89-e2666a8fd3b6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.355034 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-64zqx"] Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.355189 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" event={"ID":"5e6e5780-d702-4c2e-9045-3e74bb98136a","Type":"ContainerStarted","Data":"28c73a213de6c8ec410219e24e5f51b54feef7a4b0f003abfe4c06485504e56b"} Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.357633 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.359980 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jqx2k"] Sep 30 13:37:58 crc kubenswrapper[4763]: W0930 13:37:58.364546 4763 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d4464e7_c998_48f2_bac7_bf0da585931e.slice/crio-43b3ab15046b8514276ec93b659bba11a2752313dfebaf57a673b505a8100ec7 WatchSource:0}: Error finding container 43b3ab15046b8514276ec93b659bba11a2752313dfebaf57a673b505a8100ec7: Status 404 returned error can't find the container with id 43b3ab15046b8514276ec93b659bba11a2752313dfebaf57a673b505a8100ec7 Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.365305 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/66560dbc-ad52-44b6-ad49-a8c83e403714-signing-key\") pod \"service-ca-9c57cc56f-ldk4q\" (UID: \"66560dbc-ad52-44b6-ad49-a8c83e403714\") " pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.368166 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj" event={"ID":"c86dd83f-1362-49d9-aecb-9e86cb66ebcd","Type":"ContainerStarted","Data":"c4b6891ad33088dfb96981578ff1fad47acdb4da2c84568c0c8eec8df5874b49"} Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.368222 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj" event={"ID":"c86dd83f-1362-49d9-aecb-9e86cb66ebcd","Type":"ContainerStarted","Data":"7f98bea4bcc884cd4e8dc100d1e70fe4d241ad9178ab8ff620d70b528e88ea06"} Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.368242 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj" event={"ID":"c86dd83f-1362-49d9-aecb-9e86cb66ebcd","Type":"ContainerStarted","Data":"fab5c02bf1e30880f4640255c2afd9150aaae795bae0326e254c0a86c93fe3ec"} Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.370948 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv"] Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.371788 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.377350 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" event={"ID":"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b","Type":"ContainerStarted","Data":"35309be27a79a571d3b89efd70ea92f1bcecdcec7995fecb739c4d1e2f9f35dd"} Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.377392 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" event={"ID":"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b","Type":"ContainerStarted","Data":"2bef99ffddaeed2b89c9cb3cdbc530b0931c1b8440a0a02e5733915083344896"} Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.378775 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" event={"ID":"9eaed9c6-6995-4062-8c6b-a41853220149","Type":"ContainerStarted","Data":"cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b"} Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.378827 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.378841 4763 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" event={"ID":"9eaed9c6-6995-4062-8c6b-a41853220149","Type":"ContainerStarted","Data":"ad95d5e681e349a692a53b0315c14fd2214619cc565927e1b70d9382d6eb6438"} Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.380393 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ts49" event={"ID":"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322","Type":"ContainerStarted","Data":"24acf3d09e74b6233aac92f57c684f90a5da21bfaacf8386ccbb56ecca14b75e"} Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.380721 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/66560dbc-ad52-44b6-ad49-a8c83e403714-signing-cabundle\") pod \"service-ca-9c57cc56f-ldk4q\" (UID: \"66560dbc-ad52-44b6-ad49-a8c83e403714\") " pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.380745 4763 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9rcjp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.380782 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" podUID="9eaed9c6-6995-4062-8c6b-a41853220149" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.380898 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.381022 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.881003585 +0000 UTC m=+151.019563880 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.381535 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.381677 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k" event={"ID":"2f98df76-283e-4a40-8985-e876b83119ce","Type":"ContainerStarted","Data":"bbd70db9f802a6a2f78e35a2f7133808d24889c8f322b6bbbc5d2fcbe610a42a"}
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.381700 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k" event={"ID":"2f98df76-283e-4a40-8985-e876b83119ce","Type":"ContainerStarted","Data":"eb3afbc1458f0e1ecb94fe93a296aca9a56518e6670adb6eea930defd0bc9118"}
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.381827 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.881819184 +0000 UTC m=+151.020379469 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.383292 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p5rvt" event={"ID":"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e","Type":"ContainerStarted","Data":"21f93a369f3254b198e9bff025a86352cc7394e302c69e617c4c1c02fa9a46a4"}
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.392561 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.411965 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.432801 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.451512 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.464166 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7a55a5-5e61-465e-927b-08cc7e8884a2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2qnwb\" (UID: \"1f7a55a5-5e61-465e-927b-08cc7e8884a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.473051 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.483337 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.484667 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:58.984652442 +0000 UTC m=+151.123212727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.490526 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.491990 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.505696 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7a55a5-5e61-465e-927b-08cc7e8884a2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2qnwb\" (UID: \"1f7a55a5-5e61-465e-927b-08cc7e8884a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.512274 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.531959 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.553939 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.568319 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a660fb91-7237-4d7a-9c8b-643f63bd589f-serving-cert\") pod \"service-ca-operator-777779d784-zt76t\" (UID: \"a660fb91-7237-4d7a-9c8b-643f63bd589f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.573009 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.584857 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.585222 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.085204016 +0000 UTC m=+151.223764301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.591321 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.611581 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.618800 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a660fb91-7237-4d7a-9c8b-643f63bd589f-config\") pod \"service-ca-operator-777779d784-zt76t\" (UID: \"a660fb91-7237-4d7a-9c8b-643f63bd589f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.632324 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.651678 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.661247 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-apiservice-cert\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.661797 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-webhook-cert\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.671975 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.686305 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.686426 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.186395906 +0000 UTC m=+151.324956181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.686522 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.687181 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.187058371 +0000 UTC m=+151.325618766 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.692811 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.705712 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86"]
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.712122 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.731703 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.752313 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.757575 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a540445-8589-4437-b134-38ba9d38faf0-config-volume\") pod \"collect-profiles-29320650-kpklw\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw"
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.766254 4763 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.766307 4763 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.766320 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-metrics-tls podName:e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9 nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.266304421 +0000 UTC m=+151.404864706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-metrics-tls") pod "dns-default-bdwgk" (UID: "e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9") : failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.766357 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-config-volume podName:e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9 nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.266336302 +0000 UTC m=+151.404896587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-config-volume") pod "dns-default-bdwgk" (UID: "e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9") : failed to sync configmap cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.767387 4763 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-operator-config: failed to sync configmap cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.767424 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-config podName:d11299ad-6339-4844-8031-d517a5535b1b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.267416136 +0000 UTC m=+151.405976421 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-config") pod "etcd-operator-b45778765-7czqk" (UID: "d11299ad-6339-4844-8031-d517a5535b1b") : failed to sync configmap cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.767447 4763 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.767470 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-certs podName:0c3003e9-34dc-492f-8cd2-ca6851841117 nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.267464428 +0000 UTC m=+151.406024703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-certs") pod "machine-config-server-pfmnw" (UID: "0c3003e9-34dc-492f-8cd2-ca6851841117") : failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.767828 4763 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.767880 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-serving-cert podName:d11299ad-6339-4844-8031-d517a5535b1b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.267868877 +0000 UTC m=+151.406429162 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-serving-cert") pod "etcd-operator-b45778765-7czqk" (UID: "d11299ad-6339-4844-8031-d517a5535b1b") : failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.768996 4763 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-client: failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.769038 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-etcd-client podName:d11299ad-6339-4844-8031-d517a5535b1b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.269028934 +0000 UTC m=+151.407589219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-etcd-client") pod "etcd-operator-b45778765-7czqk" (UID: "d11299ad-6339-4844-8031-d517a5535b1b") : failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.769051 4763 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.769066 4763 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.769083 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-service-ca podName:d11299ad-6339-4844-8031-d517a5535b1b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.269074895 +0000 UTC m=+151.407635180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-service-ca" (UniqueName: "kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-service-ca") pod "etcd-operator-b45778765-7czqk" (UID: "d11299ad-6339-4844-8031-d517a5535b1b") : failed to sync configmap cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.769089 4763 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.769080 4763 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.769094 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e350689d-81e5-4fc9-a346-b57a553f39fd-cert podName:e350689d-81e5-4fc9-a346-b57a553f39fd nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.269089226 +0000 UTC m=+151.407649511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e350689d-81e5-4fc9-a346-b57a553f39fd-cert") pod "ingress-canary-55k7x" (UID: "e350689d-81e5-4fc9-a346-b57a553f39fd") : failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.769185 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-node-bootstrap-token podName:0c3003e9-34dc-492f-8cd2-ca6851841117 nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.269179008 +0000 UTC m=+151.407739293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-node-bootstrap-token") pod "machine-config-server-pfmnw" (UID: "0c3003e9-34dc-492f-8cd2-ca6851841117") : failed to sync secret cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.769200 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-ca podName:d11299ad-6339-4844-8031-d517a5535b1b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.269192228 +0000 UTC m=+151.407752513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-ca" (UniqueName: "kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-ca") pod "etcd-operator-b45778765-7czqk" (UID: "d11299ad-6339-4844-8031-d517a5535b1b") : failed to sync configmap cache: timed out waiting for the condition
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.772316 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Sep 30 13:37:58 crc kubenswrapper[4763]: W0930 13:37:58.786860 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee6b454d_afd6_400e_8f72_1880a5485abf.slice/crio-8b195bf1371dc4731a48f3de28eaba5657146ae48a5b20d108a42ec515d07f81 WatchSource:0}: Error finding container 8b195bf1371dc4731a48f3de28eaba5657146ae48a5b20d108a42ec515d07f81: Status 404 returned error can't find the container with id 8b195bf1371dc4731a48f3de28eaba5657146ae48a5b20d108a42ec515d07f81
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.787883 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.788261 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.28823846 +0000 UTC m=+151.426798745 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.788549 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.788920 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.288902555 +0000 UTC m=+151.427462840 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.791447 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.812899 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.833130 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.851968 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.872232 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.890199 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.890402 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.390371001 +0000 UTC m=+151.528931296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.890551 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.891085 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.391066097 +0000 UTC m=+151.529626452 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.893183 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.913066 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.942347 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.952686 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.972622 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.992165 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.992496 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.492476282 +0000 UTC m=+151.631036567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.992803 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:58 crc kubenswrapper[4763]: E0930 13:37:58.993202 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.493189869 +0000 UTC m=+151.631750154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:58 crc kubenswrapper[4763]: I0930 13:37:58.996466 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.012620 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.037033 4763 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.052212 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.094775 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:37:59 crc kubenswrapper[4763]: E0930 13:37:59.095686 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.595667398 +0000 UTC m=+151.734227693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.102581 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnng\" (UniqueName: \"kubernetes.io/projected/eb3a4cfd-1db3-488c-ba4c-bd04add6bd05-kube-api-access-smnng\") pod \"downloads-7954f5f757-f228d\" (UID: \"eb3a4cfd-1db3-488c-ba4c-bd04add6bd05\") " pod="openshift-console/downloads-7954f5f757-f228d"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.107910 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ckscg\" (UID: \"35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.110591 4763 request.go:700] Waited for 1.954798047s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.129284 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94s87\" (UniqueName: \"kubernetes.io/projected/589a627b-9da3-4db1-80b4-93c2e444bd17-kube-api-access-94s87\") pod \"openshift-config-operator-7777fb866f-8gtwq\" (UID: \"589a627b-9da3-4db1-80b4-93c2e444bd17\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.149435 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.152273 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.172846 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.192718 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.197272 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:59 crc kubenswrapper[4763]: E0930 13:37:59.197683 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.697670536 +0000 UTC m=+151.836230821 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.212634 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.232914 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.256455 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.268658 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.275909 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-f228d"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.291379 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97399be7-e1a2-4803-b32d-3c2490e98204-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nkszw\" (UID: \"97399be7-e1a2-4803-b32d-3c2490e98204\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.298570 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.298832 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-serving-cert\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.298942 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-service-ca\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.298978 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-etcd-client\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:59 crc kubenswrapper[4763]: E0930 13:37:59.299039 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.799007759 +0000 UTC m=+151.937568044 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.299137 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-ca\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.299281 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-config-volume\") pod \"dns-default-bdwgk\" (UID: \"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9\") " pod="openshift-dns/dns-default-bdwgk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.299315 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-metrics-tls\") pod \"dns-default-bdwgk\" (UID: \"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9\") " pod="openshift-dns/dns-default-bdwgk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.299438 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-certs\") pod \"machine-config-server-pfmnw\" (UID: \"0c3003e9-34dc-492f-8cd2-ca6851841117\") " pod="openshift-machine-config-operator/machine-config-server-pfmnw"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.299505 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-node-bootstrap-token\") pod \"machine-config-server-pfmnw\" (UID: \"0c3003e9-34dc-492f-8cd2-ca6851841117\") " pod="openshift-machine-config-operator/machine-config-server-pfmnw"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.299539 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e350689d-81e5-4fc9-a346-b57a553f39fd-cert\") pod \"ingress-canary-55k7x\" (UID: \"e350689d-81e5-4fc9-a346-b57a553f39fd\") " pod="openshift-ingress-canary/ingress-canary-55k7x"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.299631 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-config\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.300045 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-service-ca\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.300311 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-config\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.300372 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-config-volume\") pod \"dns-default-bdwgk\" (UID: \"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9\") " pod="openshift-dns/dns-default-bdwgk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.301246 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d11299ad-6339-4844-8031-d517a5535b1b-etcd-ca\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.303102 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-etcd-client\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.304162 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11299ad-6339-4844-8031-d517a5535b1b-serving-cert\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.304373 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-node-bootstrap-token\") pod \"machine-config-server-pfmnw\" (UID: \"0c3003e9-34dc-492f-8cd2-ca6851841117\") " pod="openshift-machine-config-operator/machine-config-server-pfmnw"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.305530 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e350689d-81e5-4fc9-a346-b57a553f39fd-cert\") pod \"ingress-canary-55k7x\" (UID: \"e350689d-81e5-4fc9-a346-b57a553f39fd\") " pod="openshift-ingress-canary/ingress-canary-55k7x"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.305577 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-metrics-tls\") pod \"dns-default-bdwgk\" (UID: \"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9\") " pod="openshift-dns/dns-default-bdwgk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.306166 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0c3003e9-34dc-492f-8cd2-ca6851841117-certs\") pod \"machine-config-server-pfmnw\" (UID: \"0c3003e9-34dc-492f-8cd2-ca6851841117\") " pod="openshift-machine-config-operator/machine-config-server-pfmnw"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.309581 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spbgf\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-kube-api-access-spbgf\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.329327 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vp8l\" (UniqueName: \"kubernetes.io/projected/0e1dec7f-15b7-44fd-8905-9084995950c2-kube-api-access-5vp8l\") pod \"multus-admission-controller-857f4d67dd-88dg6\" (UID: \"0e1dec7f-15b7-44fd-8905-9084995950c2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.349669 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-bound-sa-token\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.373484 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db706ef4-d8fb-438b-96e5-7fef497272a0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kq9tr\" (UID: \"db706ef4-d8fb-438b-96e5-7fef497272a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.411294 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p5rvt" event={"ID":"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e","Type":"ContainerStarted","Data":"062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.417417 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnmwm\" (UniqueName: \"kubernetes.io/projected/97702a69-e0ad-47b9-b8b7-d32fadc9185e-kube-api-access-jnmwm\") pod \"csi-hostpathplugin-59chv\" (UID: \"97702a69-e0ad-47b9-b8b7-d32fadc9185e\") " pod="hostpath-provisioner/csi-hostpathplugin-59chv"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.418721 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" event={"ID":"9c7d4b69-3286-49c0-8a83-74bcccf25345","Type":"ContainerStarted","Data":"53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.418754 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" event={"ID":"9c7d4b69-3286-49c0-8a83-74bcccf25345","Type":"ContainerStarted","Data":"cfafc468c2a9234b54347475fcdc3b093ff2dcff98dfb0cd339e5711d3684228"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.420862 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.420931 4763 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-t2qjv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.420982 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" podUID="9c7d4b69-3286-49c0-8a83-74bcccf25345" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.422048 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-f7g5q" event={"ID":"5925c888-ce34-47a0-aa48-ee913adef673","Type":"ContainerStarted","Data":"861ee99f0b314c79395b9c2b60064dde8ced6325204e39375670e5aed6e38cdb"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.422087 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-f7g5q" event={"ID":"5925c888-ce34-47a0-aa48-ee913adef673","Type":"ContainerStarted","Data":"53a4fc6e7dcf11a0092338de3cf034f5234ae169fe9289e6e06f59ecd774a8c9"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.422513 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-f7g5q"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.425815 4763 patch_prober.go:28] interesting pod/console-operator-58897d9998-f7g5q container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body=
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.425887 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-f7g5q" podUID="5925c888-ce34-47a0-aa48-ee913adef673" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.426792 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.429034 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee65800a-94a4-43c1-a5bf-3a7b889619d8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.429599 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-747qm\" (UniqueName: \"kubernetes.io/projected/ef8efd92-e09f-42ee-8f4b-29ef7f6253c0-kube-api-access-747qm\") pod \"packageserver-d55dfcdfc-m74w9\" (UID: \"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.435990 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9"
Sep 30 13:37:59 crc kubenswrapper[4763]: E0930 13:37:59.437736 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:37:59.937718139 +0000 UTC m=+152.076278424 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.443054 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" event={"ID":"9d4464e7-c998-48f2-bac7-bf0da585931e","Type":"ContainerStarted","Data":"aee8762130dad0ba504eadb080367b0fe4a05fd141e3c10e401d273cd56a20aa"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.443104 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" event={"ID":"9d4464e7-c998-48f2-bac7-bf0da585931e","Type":"ContainerStarted","Data":"43b3ab15046b8514276ec93b659bba11a2752313dfebaf57a673b505a8100ec7"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.454805 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b59zj\" (UniqueName: \"kubernetes.io/projected/85d6ec09-077d-4958-b8c7-d09dd9c45e29-kube-api-access-b59zj\") pod \"machine-config-operator-74547568cd-xvk2v\" (UID: \"85d6ec09-077d-4958-b8c7-d09dd9c45e29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.455557 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" event={"ID":"817d1626-d4a3-4df7-bbbd-0ae698936819","Type":"ContainerStarted","Data":"e945220fceb7115ab17af80702d133d21cb46bc7ed457e772d9042dcfedba3ba"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.455704 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" event={"ID":"817d1626-d4a3-4df7-bbbd-0ae698936819","Type":"ContainerStarted","Data":"2fc306944d59c01dbd066a1b62d195a4dc40a2fde40d46982b8ab21fd6c75bc3"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.455722 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" event={"ID":"817d1626-d4a3-4df7-bbbd-0ae698936819","Type":"ContainerStarted","Data":"b181cdaa4bc56029f6064e7e0e6ebd3a8a08e9e24a1afe41898cf1676bfd0f15"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.457340 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" event={"ID":"5e6e5780-d702-4c2e-9045-3e74bb98136a","Type":"ContainerStarted","Data":"cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.457667 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.463545 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" event={"ID":"76012636-5dea-475a-bc3e-bdcac5a79760","Type":"ContainerStarted","Data":"ea4cf0eb5b2db237c7b16a1107242b680245ae336883934412784b9cea715b76"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.463744 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" event={"ID":"76012636-5dea-475a-bc3e-bdcac5a79760","Type":"ContainerStarted","Data":"c69b40e48718004c08d3ce615d0d729a6ba0e3a40efe702a18b8998b52c9432d"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.477420 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" event={"ID":"4be31ea9-a6d5-485e-bb3b-e3b4e4fdf77b","Type":"ContainerStarted","Data":"b194e0d58894dc1a3022bd9cd5234bf50134481e4c48c7f4260e0e1cfab7f3b0"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.479303 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw5ff\" (UniqueName: \"kubernetes.io/projected/e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9-kube-api-access-fw5ff\") pod \"dns-default-bdwgk\" (UID: \"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9\") " pod="openshift-dns/dns-default-bdwgk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.481883 4763 generic.go:334] "Generic (PLEG): container finished" podID="f2c347bc-ec1b-4ead-b9c8-f8a3443c2322" containerID="21c8c109d8540acc675a66df6a9718b218151a8afc0b1dbc5ffb6c886a3cc120" exitCode=0
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.481948 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ts49" event={"ID":"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322","Type":"ContainerDied","Data":"21c8c109d8540acc675a66df6a9718b218151a8afc0b1dbc5ffb6c886a3cc120"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.484269 4763 generic.go:334] "Generic (PLEG): container finished" podID="ee6b454d-afd6-400e-8f72-1880a5485abf" containerID="23f896b940e16c238b949dc192a20593dc309fbd638266c3831c15a103b44327" exitCode=0
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.485835 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" event={"ID":"ee6b454d-afd6-400e-8f72-1880a5485abf","Type":"ContainerDied","Data":"23f896b940e16c238b949dc192a20593dc309fbd638266c3831c15a103b44327"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.485861 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" event={"ID":"ee6b454d-afd6-400e-8f72-1880a5485abf","Type":"ContainerStarted","Data":"8b195bf1371dc4731a48f3de28eaba5657146ae48a5b20d108a42ec515d07f81"}
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.487302 4763 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9rcjp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.487337 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" podUID="9eaed9c6-6995-4062-8c6b-a41853220149" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.498535 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qrh4\" (UniqueName: \"kubernetes.io/projected/7c634595-a271-4ae0-8477-39ef345aa87b-kube-api-access-4qrh4\") pod \"migrator-59844c95c7-w2tnd\" (UID: \"7c634595-a271-4ae0-8477-39ef345aa87b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.507934 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg"]
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.511563 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-59chv"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.512399 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2nsb\" (UniqueName: \"kubernetes.io/projected/8920a637-88bb-4ecd-b699-7f66dd955746-kube-api-access-c2nsb\") pod \"machine-config-controller-84d6567774-ndxvt\" (UID: \"8920a637-88bb-4ecd-b699-7f66dd955746\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.520437 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq"]
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.523840 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bdwgk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.528620 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:37:59 crc kubenswrapper[4763]: E0930 13:37:59.528867 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.028848245 +0000 UTC m=+152.167408530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.530370 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:37:59 crc kubenswrapper[4763]: E0930 13:37:59.534144 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.034127858 +0000 UTC m=+152.172688243 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.537712 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srz4b\" (UniqueName: \"kubernetes.io/projected/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-kube-api-access-srz4b\") pod \"marketplace-operator-79b997595-dqjfv\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.539212 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.569782 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqvbq\" (UniqueName: \"kubernetes.io/projected/d11299ad-6339-4844-8031-d517a5535b1b-kube-api-access-mqvbq\") pod \"etcd-operator-b45778765-7czqk\" (UID: \"d11299ad-6339-4844-8031-d517a5535b1b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.572402 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjp4x\" (UniqueName: \"kubernetes.io/projected/dfb46cdc-6bb1-48a9-a80f-3330b65f96cf-kube-api-access-fjp4x\") pod \"olm-operator-6b444d44fb-g4x6h\" (UID: \"dfb46cdc-6bb1-48a9-a80f-3330b65f96cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.585897 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr"
Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.589757 4763 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.604354 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zq6k\" (UniqueName: \"kubernetes.io/projected/e350689d-81e5-4fc9-a346-b57a553f39fd-kube-api-access-2zq6k\") pod \"ingress-canary-55k7x\" (UID: \"e350689d-81e5-4fc9-a346-b57a553f39fd\") " pod="openshift-ingress-canary/ingress-canary-55k7x" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.609349 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ltnp\" (UniqueName: \"kubernetes.io/projected/ee65800a-94a4-43c1-a5bf-3a7b889619d8-kube-api-access-6ltnp\") pod \"ingress-operator-5b745b69d9-6k46j\" (UID: \"ee65800a-94a4-43c1-a5bf-3a7b889619d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" Sep 30 13:37:59 crc kubenswrapper[4763]: W0930 13:37:59.609486 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod589a627b_9da3_4db1_80b4_93c2e444bd17.slice/crio-b7ebc6479c7813f4e3bd3eb083bc8a2c70d4ef8b355ae1c0714bfc7e42b15334 WatchSource:0}: Error finding container b7ebc6479c7813f4e3bd3eb083bc8a2c70d4ef8b355ae1c0714bfc7e42b15334: Status 404 returned error can't find the container with id b7ebc6479c7813f4e3bd3eb083bc8a2c70d4ef8b355ae1c0714bfc7e42b15334 Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.613779 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f228d"] Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.614868 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.631072 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.631814 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.633861 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n84z7\" (UniqueName: \"kubernetes.io/projected/63d0c459-8206-47bc-991f-2c3a1ed20a4f-kube-api-access-n84z7\") pod \"openshift-controller-manager-operator-756b6f6bc6-7jfqq\" (UID: \"63d0c459-8206-47bc-991f-2c3a1ed20a4f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" Sep 30 13:37:59 crc kubenswrapper[4763]: E0930 13:37:59.634000 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.133973626 +0000 UTC m=+152.272533911 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.637149 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.653210 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.658026 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fldhw\" (UniqueName: \"kubernetes.io/projected/1f7a55a5-5e61-465e-927b-08cc7e8884a2-kube-api-access-fldhw\") pod \"kube-storage-version-migrator-operator-b67b599dd-2qnwb\" (UID: \"1f7a55a5-5e61-465e-927b-08cc7e8884a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.667139 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.676206 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt68q\" (UniqueName: \"kubernetes.io/projected/d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3-kube-api-access-wt68q\") pod \"router-default-5444994796-gdlbd\" (UID: \"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3\") " pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.691456 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlxsj\" (UniqueName: \"kubernetes.io/projected/318791e7-bdca-4695-9910-b9162ff85baf-kube-api-access-wlxsj\") pod \"dns-operator-744455d44c-m47xm\" (UID: \"318791e7-bdca-4695-9910-b9162ff85baf\") " pod="openshift-dns-operator/dns-operator-744455d44c-m47xm" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.697816 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.707543 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9"] Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.714223 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65jt9\" (UniqueName: \"kubernetes.io/projected/0c3003e9-34dc-492f-8cd2-ca6851841117-kube-api-access-65jt9\") pod \"machine-config-server-pfmnw\" (UID: \"0c3003e9-34dc-492f-8cd2-ca6851841117\") " pod="openshift-machine-config-operator/machine-config-server-pfmnw" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.736530 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:59 crc kubenswrapper[4763]: E0930 13:37:59.736968 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.236952907 +0000 UTC m=+152.375513192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.739971 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzbks\" (UniqueName: \"kubernetes.io/projected/a2444e42-08b2-4e35-ae89-e2666a8fd3b6-kube-api-access-hzbks\") pod \"package-server-manager-789f6589d5-ddx8p\" (UID: \"a2444e42-08b2-4e35-ae89-e2666a8fd3b6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.740317 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.753926 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhwtj\" (UniqueName: \"kubernetes.io/projected/b20ac013-c0be-4b7a-b5a8-cd6db89814ee-kube-api-access-nhwtj\") pod \"control-plane-machine-set-operator-78cbb6b69f-ms8hj\" (UID: \"b20ac013-c0be-4b7a-b5a8-cd6db89814ee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.770972 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvl9\" (UniqueName: \"kubernetes.io/projected/66560dbc-ad52-44b6-ad49-a8c83e403714-kube-api-access-4rvl9\") pod \"service-ca-9c57cc56f-ldk4q\" (UID: \"66560dbc-ad52-44b6-ad49-a8c83e403714\") " pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.779302 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.780790 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-55k7x" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.807098 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.817220 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pvq8\" (UniqueName: \"kubernetes.io/projected/a660fb91-7237-4d7a-9c8b-643f63bd589f-kube-api-access-6pvq8\") pod \"service-ca-operator-777779d784-zt76t\" (UID: \"a660fb91-7237-4d7a-9c8b-643f63bd589f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.817630 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4jgt\" (UniqueName: \"kubernetes.io/projected/eb5fc3ab-ae64-4fea-ba4f-010a25c6791c-kube-api-access-v4jgt\") pod \"catalog-operator-68c6474976-xbmq9\" (UID: \"eb5fc3ab-ae64-4fea-ba4f-010a25c6791c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.831437 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tplb6\" (UniqueName: \"kubernetes.io/projected/6a540445-8589-4437-b134-38ba9d38faf0-kube-api-access-tplb6\") pod \"collect-profiles-29320650-kpklw\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.831681 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pfmnw" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.838547 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:37:59 crc kubenswrapper[4763]: E0930 13:37:59.838975 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.338957616 +0000 UTC m=+152.477517901 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.897540 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.915972 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.942288 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:37:59 crc kubenswrapper[4763]: E0930 13:37:59.942620 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.442592612 +0000 UTC m=+152.581152897 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.946141 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.961927 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.973172 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-m47xm" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.987921 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p" Sep 30 13:37:59 crc kubenswrapper[4763]: I0930 13:37:59.989389 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.020864 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t" Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.056752 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.057544 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:00 crc kubenswrapper[4763]: E0930 13:38:00.057762 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.557745865 +0000 UTC m=+152.696306150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.057827 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:00 crc kubenswrapper[4763]: E0930 13:38:00.058113 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.558106203 +0000 UTC m=+152.696666478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.159669 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:00 crc kubenswrapper[4763]: E0930 13:38:00.160135 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.660119312 +0000 UTC m=+152.798679597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.187546 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-59chv"] Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.264521 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:00 crc kubenswrapper[4763]: E0930 13:38:00.265035 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.765017707 +0000 UTC m=+152.903578002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.368721 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:00 crc kubenswrapper[4763]: E0930 13:38:00.369198 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.869180886 +0000 UTC m=+153.007741171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.470556 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:00 crc kubenswrapper[4763]: E0930 13:38:00.471393 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:00.971377608 +0000 UTC m=+153.109937893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.510462 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gdlbd" event={"ID":"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3","Type":"ContainerStarted","Data":"9321172c01b951b040e57da3e82e014844652b4dffacd280d4e2824371b77851"} Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.510504 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg" event={"ID":"35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff","Type":"ContainerStarted","Data":"1fadb59625fbbe2ce4bc483325a2371767d2b28e4fcd929cb4ea51a27f4f47ce"} Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.518141 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" event={"ID":"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0","Type":"ContainerStarted","Data":"71543965db2027d3cf83a1c9b3548190c3a2b63eba38532a2eb11228fe292ada"} Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.542167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f228d" event={"ID":"eb3a4cfd-1db3-488c-ba4c-bd04add6bd05","Type":"ContainerStarted","Data":"b1ab3a7631e778a3152a229fd94943efd49fd6d270ee344ce41d5ec1f676998d"} Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.560714 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pfmnw" event={"ID":"0c3003e9-34dc-492f-8cd2-ca6851841117","Type":"ContainerStarted","Data":"df4e891d38e67f85c70cfb840a8e16fc6c4a9c75cc9b9a472e69bfd276255fb3"} Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.579847 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:00 crc kubenswrapper[4763]: E0930 13:38:00.580236 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:01.080219306 +0000 UTC m=+153.218779591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.586755 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq" event={"ID":"589a627b-9da3-4db1-80b4-93c2e444bd17","Type":"ContainerStarted","Data":"b7ebc6479c7813f4e3bd3eb083bc8a2c70d4ef8b355ae1c0714bfc7e42b15334"} Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.599474 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.685295 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:00 crc kubenswrapper[4763]: E0930 13:38:00.693775 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:01.193761852 +0000 UTC m=+153.332322137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.730635 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-f7g5q" Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.771618 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" podStartSLOduration=126.771579969 podStartE2EDuration="2m6.771579969s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:00.769714786 +0000 UTC m=+152.908275071" watchObservedRunningTime="2025-09-30 13:38:00.771579969 +0000 UTC m=+152.910140254" Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.798858 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:00 crc kubenswrapper[4763]: E0930 13:38:00.799523 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:01.299508147 +0000 UTC m=+153.438068432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.820055 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l7pfj" podStartSLOduration=126.820038383 podStartE2EDuration="2m6.820038383s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:00.818795895 +0000 UTC m=+152.957356180" watchObservedRunningTime="2025-09-30 13:38:00.820038383 +0000 UTC m=+152.958598668" Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.858090 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wrz4k" podStartSLOduration=126.858063527 podStartE2EDuration="2m6.858063527s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:00.854281099 +0000 UTC m=+152.992841404" watchObservedRunningTime="2025-09-30 13:38:00.858063527 +0000 UTC m=+152.996623812" Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.903282 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:00 crc kubenswrapper[4763]: E0930 13:38:00.903705 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:01.403686156 +0000 UTC m=+153.542246491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:00 crc kubenswrapper[4763]: I0930 13:38:00.971563 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jqx2k" podStartSLOduration=126.971545302 podStartE2EDuration="2m6.971545302s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:00.97146488 +0000 UTC m=+153.110025165" watchObservedRunningTime="2025-09-30 13:38:00.971545302 +0000 UTC m=+153.110105577" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.004444 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.004704 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:01.50464723 +0000 UTC m=+153.643207515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.004934 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.005425 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:01.505417878 +0000 UTC m=+153.643978163 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.109051 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.109414 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:01.609383262 +0000 UTC m=+153.747943547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.109671 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.110057 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:01.610040368 +0000 UTC m=+153.748600653 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.153773 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" podStartSLOduration=127.153752502 podStartE2EDuration="2m7.153752502s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:01.092533551 +0000 UTC m=+153.231093846" watchObservedRunningTime="2025-09-30 13:38:01.153752502 +0000 UTC m=+153.292312787" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.216916 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.217230 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:01.717217035 +0000 UTC m=+153.855777320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.294782 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x2jhk" podStartSLOduration=127.294739806 podStartE2EDuration="2m7.294739806s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:01.252733221 +0000 UTC m=+153.391293506" watchObservedRunningTime="2025-09-30 13:38:01.294739806 +0000 UTC m=+153.433300091" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.318341 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.318750 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:01.818733943 +0000 UTC m=+153.957294228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.330434 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dqjfv"] Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.369657 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr"] Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.381133 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bdwgk"] Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.381184 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw"] Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.419510 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.420075 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:01.920054245 +0000 UTC m=+154.058614530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.521682 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.522050 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:02.022035983 +0000 UTC m=+154.160596268 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.549909 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-p5rvt" podStartSLOduration=127.54988889 podStartE2EDuration="2m7.54988889s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:01.493945011 +0000 UTC m=+153.632505296" watchObservedRunningTime="2025-09-30 13:38:01.54988889 +0000 UTC m=+153.688449175" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.551887 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lvdnm" podStartSLOduration=127.551876966 podStartE2EDuration="2m7.551876966s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:01.547724249 +0000 UTC m=+153.686284544" watchObservedRunningTime="2025-09-30 13:38:01.551876966 +0000 UTC m=+153.690437251" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.598723 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg" event={"ID":"35d4ddd9-8b2f-435d-aa88-9cdd4b1f88ff","Type":"ContainerStarted","Data":"f8e09a0f1cd2d83534972f5337f85d379251602d118f8b61a05c816796d9852d"} Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.603675 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" event={"ID":"ef8efd92-e09f-42ee-8f4b-29ef7f6253c0","Type":"ContainerStarted","Data":"3f9f6d3529fad2d1337e454c2e985fa8c4f624eb569a489f512eb0ad9ece7af5"} Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.604207 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.610866 4763 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m74w9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.611146 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" podUID="ef8efd92-e09f-42ee-8f4b-29ef7f6253c0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.614936 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-f7g5q" podStartSLOduration=127.614915999 
podStartE2EDuration="2m7.614915999s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:01.614285105 +0000 UTC m=+153.752845390" watchObservedRunningTime="2025-09-30 13:38:01.614915999 +0000 UTC m=+153.753476284" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.620675 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pfmnw" event={"ID":"0c3003e9-34dc-492f-8cd2-ca6851841117","Type":"ContainerStarted","Data":"26e5cba096cf5f415dfa721397a395749bc42d33857273d26d5366f904b8bfa2"} Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.623260 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.623847 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:02.123831507 +0000 UTC m=+154.262391792 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.637154 4763 generic.go:334] "Generic (PLEG): container finished" podID="589a627b-9da3-4db1-80b4-93c2e444bd17" containerID="80ee3954e8f230b746544105b14baef5a6c4392e4bbf36959119779a67697ebf" exitCode=0 Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.637242 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq" event={"ID":"589a627b-9da3-4db1-80b4-93c2e444bd17","Type":"ContainerDied","Data":"80ee3954e8f230b746544105b14baef5a6c4392e4bbf36959119779a67697ebf"} Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.644890 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-59chv" event={"ID":"97702a69-e0ad-47b9-b8b7-d32fadc9185e","Type":"ContainerStarted","Data":"d7e212b1797b7534d5bde6aca8a1d8ba0b9bb4b5ed97d4c229f8ffd971bca1e8"} Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.649369 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" event={"ID":"ee6b454d-afd6-400e-8f72-1880a5485abf","Type":"ContainerStarted","Data":"a8fe3c481964bf3867aa1a1dc3fd17dfd932b8b8b370bc6bdc7fd027890d89d4"} Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.654421 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gdlbd" event={"ID":"d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3","Type":"ContainerStarted","Data":"4e1bb5a6f1aa2d8d4c24af5382d1cfe0b81dcaba3d66bd83db7c286f14533dc6"} Sep 30 13:38:01 crc kubenswrapper[4763]: 
I0930 13:38:01.701264 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ts49" event={"ID":"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322","Type":"ContainerStarted","Data":"b0e0dcf9709c381eea3060c32c5c3fd992f0d29244dfdc386c6fb61280e87bef"} Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.701333 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ts49" event={"ID":"f2c347bc-ec1b-4ead-b9c8-f8a3443c2322","Type":"ContainerStarted","Data":"c47dd3dec58e41813decaba81626b7b20839afa661cae6e2d97b899b08c377a2"} Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.747027 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.749364 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f228d" event={"ID":"eb3a4cfd-1db3-488c-ba4c-bd04add6bd05","Type":"ContainerStarted","Data":"b0d9a4057e92cab7763fc1165fb3659bd41e79ecefd5ae0a65abac50403c83e9"} Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.750276 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-f228d" Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.755398 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:02.255382381 +0000 UTC m=+154.393942666 (durationBeforeRetry 500ms). 
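
The failure repeating through this stretch of the log, "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers", is the kubelet's node-local CSI registry speaking: the hostpath plugin has not yet announced itself over the kubelet plugin-registration socket, and the csi-hostpathplugin-59chv containers are only just starting elsewhere in these entries, so the mount and unmount retries appear to be the expected transient state until registration completes. As a coarse cluster-side cross-check one can list the storage.k8s.io/v1 CSIDriver objects; the client-go sketch below is a minimal illustration only (the kubeconfig path and error handling are assumptions, and the API server's view is not identical to the kubelet's in-memory registry that this error message refers to).

    // Minimal client-go sketch (assumptions noted above): list CSIDriver
    // objects to see whether kubevirt.io.hostpath-provisioner exists from
    // the API server's point of view.
    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Illustrative kubeconfig path; inside a pod use rest.InClusterConfig().
        cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        drivers, err := cs.StorageV1().CSIDrivers().List(context.TODO(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        for _, d := range drivers.Items {
            fmt.Println(d.Name) // expect kubevirt.io.hostpath-provisioner once registered
        }
    }
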
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.770729 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-f228d container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.770801 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f228d" podUID="eb3a4cfd-1db3-488c-ba4c-bd04add6bd05" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.791406 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-64zqx" podStartSLOduration=127.791385456 podStartE2EDuration="2m7.791385456s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:01.790067886 +0000 UTC m=+153.928628171" watchObservedRunningTime="2025-09-30 13:38:01.791385456 +0000 UTC m=+153.929945741" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.850226 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.851505 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:02.351489492 +0000 UTC m=+154.490049777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.894802 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" podStartSLOduration=127.894781327 podStartE2EDuration="2m7.894781327s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:01.894684015 +0000 UTC m=+154.033244300" watchObservedRunningTime="2025-09-30 13:38:01.894781327 +0000 UTC m=+154.033341612" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.955142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:01 crc kubenswrapper[4763]: E0930 13:38:01.955564 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:02.455551348 +0000 UTC m=+154.594111633 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.968296 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.971043 4763 patch_prober.go:28] interesting pod/router-default-5444994796-gdlbd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:38:01 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Sep 30 13:38:01 crc kubenswrapper[4763]: [+]process-running ok Sep 30 13:38:01 crc kubenswrapper[4763]: healthz check failed Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.971078 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gdlbd" podUID="d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:38:01 crc kubenswrapper[4763]: I0930 13:38:01.988301 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h"] Sep 30 13:38:02 crc kubenswrapper[4763]: W0930 13:38:02.006164 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfb46cdc_6bb1_48a9_a80f_3330b65f96cf.slice/crio-67eca637a59f7dc3c956b868e303d5e1952345e7f01979da83ffa7560842673b WatchSource:0}: Error finding container 67eca637a59f7dc3c956b868e303d5e1952345e7f01979da83ffa7560842673b: Status 404 returned error can't find the container with id 67eca637a59f7dc3c956b868e303d5e1952345e7f01979da83ffa7560842673b Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.008514 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.048682 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gdlbd" podStartSLOduration=128.04866767 podStartE2EDuration="2m8.04866767s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.046331156 +0000 UTC m=+154.184891441" watchObservedRunningTime="2025-09-30 13:38:02.04866767 +0000 UTC m=+154.187227945" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.055868 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:02 crc kubenswrapper[4763]: E0930 13:38:02.058269 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:02.558238873 +0000 UTC m=+154.696799158 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.059755 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.079348 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.107081 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-88dg6"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.108879 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" podStartSLOduration=128.108862838 podStartE2EDuration="2m8.108862838s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.076001225 +0000 UTC m=+154.214561540" watchObservedRunningTime="2025-09-30 13:38:02.108862838 +0000 UTC m=+154.247423123" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.118047 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.130798 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.158751 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:02 crc kubenswrapper[4763]: E0930 13:38:02.163855 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:02.663838334 +0000 UTC m=+154.802398619 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.175285 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" podStartSLOduration=128.175270539 podStartE2EDuration="2m8.175270539s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.175240698 +0000 UTC m=+154.313800983" watchObservedRunningTime="2025-09-30 13:38:02.175270539 +0000 UTC m=+154.313830824" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.230367 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-55k7x"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.235166 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pfmnw" podStartSLOduration=5.23514337 podStartE2EDuration="5.23514337s" podCreationTimestamp="2025-09-30 13:37:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.224267127 +0000 UTC m=+154.362827412" watchObservedRunningTime="2025-09-30 13:38:02.23514337 +0000 UTC m=+154.373703655" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.235719 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m47xm"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.243862 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zt76t"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.259112 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.262642 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:02 crc kubenswrapper[4763]: E0930 13:38:02.263701 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:02.763685503 +0000 UTC m=+154.902245788 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.284178 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.294087 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7czqk"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.311112 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.314441 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-6ts49" podStartSLOduration=128.31441747 podStartE2EDuration="2m8.31441747s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.293798552 +0000 UTC m=+154.432358837" watchObservedRunningTime="2025-09-30 13:38:02.31441747 +0000 UTC m=+154.452977745" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.314662 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.374244 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:02 crc kubenswrapper[4763]: E0930 13:38:02.374822 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:02.874809813 +0000 UTC m=+155.013370098 (durationBeforeRetry 500ms). 
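
The E-level lines from nestedpendingoperations.go:348 show the volume manager's retry discipline: a failed mount or unmount is not retried immediately but parked with "No retries permitted until ..." and a delay (durationBeforeRetry 500ms here, the base of the kubelet's per-operation exponential backoff). The sketch below reproduces that retry shape with apimachinery's wait package; driverRegistered is a stand-in condition invented for the example, not a kubelet function.

    // Retry-with-backoff sketch, assuming a fictional driverRegistered check.
    package main

    import (
        "errors"
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    // Stand-in for "has kubevirt.io.hostpath-provisioner registered yet?".
    func driverRegistered() bool { return false }

    func main() {
        backoff := wait.Backoff{
            Duration: 500 * time.Millisecond, // matches durationBeforeRetry 500ms in the log
            Factor:   2.0,                    // double the delay after each failure
            Steps:    5,                      // bound the attempts in this sketch
        }
        err := wait.ExponentialBackoff(backoff, func() (bool, error) {
            if driverRegistered() {
                return true, nil // condition met, stop retrying
            }
            fmt.Println("driver not registered yet; backing off")
            return false, nil // not yet, retry after the next delay
        })
        if errors.Is(err, wait.ErrWaitTimeout) {
            fmt.Println("gave up waiting for driver registration")
        }
    }
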
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.382858 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ckscg" podStartSLOduration=128.382842509 podStartE2EDuration="2m8.382842509s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.374892184 +0000 UTC m=+154.513452459" watchObservedRunningTime="2025-09-30 13:38:02.382842509 +0000 UTC m=+154.521402794" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.383161 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj"] Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.406673 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-f228d" podStartSLOduration=128.406651152 podStartE2EDuration="2m8.406651152s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.406075368 +0000 UTC m=+154.544635643" watchObservedRunningTime="2025-09-30 13:38:02.406651152 +0000 UTC m=+154.545211437" Sep 30 13:38:02 crc kubenswrapper[4763]: W0930 13:38:02.437767 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63d0c459_8206_47bc_991f_2c3a1ed20a4f.slice/crio-0979fc0ea0dead447af801c25ec0b72cc3d28617686c7c3b6028332d17bce432 WatchSource:0}: Error finding container 0979fc0ea0dead447af801c25ec0b72cc3d28617686c7c3b6028332d17bce432: Status 404 returned error can't find the container with id 0979fc0ea0dead447af801c25ec0b72cc3d28617686c7c3b6028332d17bce432 Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.448527 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ldk4q"] Sep 30 13:38:02 crc kubenswrapper[4763]: W0930 13:38:02.454832 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb20ac013_c0be_4b7a_b5a8_cd6db89814ee.slice/crio-670175cc43b0e41b38507db650a1c9b524f8c01cd3e7c6ec7b89814e1e3ff530 WatchSource:0}: Error finding container 670175cc43b0e41b38507db650a1c9b524f8c01cd3e7c6ec7b89814e1e3ff530: Status 404 returned error can't find the container with id 670175cc43b0e41b38507db650a1c9b524f8c01cd3e7c6ec7b89814e1e3ff530 Sep 30 13:38:02 crc kubenswrapper[4763]: W0930 13:38:02.456886 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11299ad_6339_4844_8031_d517a5535b1b.slice/crio-876fb8f6a2644bb27d63cd5c4f3dc0a5ac986efeb2c0bd06f00a483d59f043b1 WatchSource:0}: Error finding container 876fb8f6a2644bb27d63cd5c4f3dc0a5ac986efeb2c0bd06f00a483d59f043b1: Status 404 returned error can't find 
the container with id 876fb8f6a2644bb27d63cd5c4f3dc0a5ac986efeb2c0bd06f00a483d59f043b1 Sep 30 13:38:02 crc kubenswrapper[4763]: W0930 13:38:02.479631 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66560dbc_ad52_44b6_ad49_a8c83e403714.slice/crio-9ed2f311c5fda6935146650e7bd36b67a1808e3ff78e4bb920b352ee10866afc WatchSource:0}: Error finding container 9ed2f311c5fda6935146650e7bd36b67a1808e3ff78e4bb920b352ee10866afc: Status 404 returned error can't find the container with id 9ed2f311c5fda6935146650e7bd36b67a1808e3ff78e4bb920b352ee10866afc Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.491785 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:02 crc kubenswrapper[4763]: E0930 13:38:02.492258 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:02.992242349 +0000 UTC m=+155.130802634 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.595127 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:02 crc kubenswrapper[4763]: E0930 13:38:02.595562 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:03.095549158 +0000 UTC m=+155.234109443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.696146 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:02 crc kubenswrapper[4763]: E0930 13:38:02.696799 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:03.196770978 +0000 UTC m=+155.335331263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.699298 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:02 crc kubenswrapper[4763]: E0930 13:38:02.700478 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:03.200462164 +0000 UTC m=+155.339022449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.761784 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.762062 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-6ts49" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.764674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p" event={"ID":"a2444e42-08b2-4e35-ae89-e2666a8fd3b6","Type":"ContainerStarted","Data":"7eef29dfe04c21fb963023f7e6677af6eb0bcc7f6c13aa82e1994947f191f50f"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.773584 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw" event={"ID":"97399be7-e1a2-4803-b32d-3c2490e98204","Type":"ContainerStarted","Data":"c304de544224fa1d9ea0464619bd7dce73ba43c764784e962d91160848e6b113"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.773706 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw" event={"ID":"97399be7-e1a2-4803-b32d-3c2490e98204","Type":"ContainerStarted","Data":"36a42606127588be192e107bed7ed9ead463791f4cd73b03a555d729e535ae6c"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.780470 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" event={"ID":"85d6ec09-077d-4958-b8c7-d09dd9c45e29","Type":"ContainerStarted","Data":"b118c724d8b9d85fa765006201d808567bed7793a5174a607a6f63a76c8d5d58"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.780509 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" event={"ID":"85d6ec09-077d-4958-b8c7-d09dd9c45e29","Type":"ContainerStarted","Data":"a342a0f660636addd01126ceac8bcfbf4ad030ad6417efe72341117e856c2adc"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.781634 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t" event={"ID":"a660fb91-7237-4d7a-9c8b-643f63bd589f","Type":"ContainerStarted","Data":"6cb7182951d7b7c4dc66207bc9e364c31e52334dd0c108c747f28a21c3fc10c5"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.800098 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb" event={"ID":"1f7a55a5-5e61-465e-927b-08cc7e8884a2","Type":"ContainerStarted","Data":"bf8969c1afb240436495ceac66adb2da126702422b5ad8ee0b22508db28ae5c9"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.800146 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb" event={"ID":"1f7a55a5-5e61-465e-927b-08cc7e8884a2","Type":"ContainerStarted","Data":"cfbd02e402bcf978224eeb37f0f5d8088cf8aa1db6c37c9e047e125a26c82a7a"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.802052 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:02 crc kubenswrapper[4763]: E0930 13:38:02.802509 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:03.302494602 +0000 UTC m=+155.441054877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.803865 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" event={"ID":"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9","Type":"ContainerStarted","Data":"02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.803933 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" event={"ID":"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9","Type":"ContainerStarted","Data":"505b873f37846c4ab50c2934b683230b40be4be7a3b5181c69fbe98d4b54c239"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.804815 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.806166 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" event={"ID":"eb5fc3ab-ae64-4fea-ba4f-010a25c6791c","Type":"ContainerStarted","Data":"28fda578e479de625ff8259ed384cbad71f95e19056250c9c1f105cdb6120332"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.812392 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m47xm" event={"ID":"318791e7-bdca-4695-9910-b9162ff85baf","Type":"ContainerStarted","Data":"5eb740e381f4a737d98e50aa125d19eb2b53a4d7541b9cfaaf8c2dd042785fc7"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.813045 4763 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dqjfv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.813099 4763 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" podUID="2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.820714 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nkszw" podStartSLOduration=128.820696495 podStartE2EDuration="2m8.820696495s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.803221159 +0000 UTC m=+154.941781444" watchObservedRunningTime="2025-09-30 13:38:02.820696495 +0000 UTC m=+154.959256780" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.823782 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-55k7x" event={"ID":"e350689d-81e5-4fc9-a346-b57a553f39fd","Type":"ContainerStarted","Data":"255bca1a051794aff25cb884b8064377739c6d05f5d937d932f6c9645c9b4370"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.830643 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" event={"ID":"63d0c459-8206-47bc-991f-2c3a1ed20a4f","Type":"ContainerStarted","Data":"0979fc0ea0dead447af801c25ec0b72cc3d28617686c7c3b6028332d17bce432"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.841809 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2qnwb" podStartSLOduration=128.841795024 podStartE2EDuration="2m8.841795024s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.841201201 +0000 UTC m=+154.979761486" watchObservedRunningTime="2025-09-30 13:38:02.841795024 +0000 UTC m=+154.980355309" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.868006 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" event={"ID":"8920a637-88bb-4ecd-b699-7f66dd955746","Type":"ContainerStarted","Data":"26eb937e0b51f8c0ccbdb14f3cba6cad438cc15ff63c8b64f21eb9c10590181c"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.877830 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd" event={"ID":"7c634595-a271-4ae0-8477-39ef345aa87b","Type":"ContainerStarted","Data":"2ebbea6a6154e7222c4be99111b1f9d4eb68d789bb11ec4287b75d2338a4285f"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.879442 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" event={"ID":"dfb46cdc-6bb1-48a9-a80f-3330b65f96cf","Type":"ContainerStarted","Data":"49eb55d5a6817510a99fcf8e140b97dd80d4fce8849d7bd59f9d1bee1a0e6c74"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.879464 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" 
event={"ID":"dfb46cdc-6bb1-48a9-a80f-3330b65f96cf","Type":"ContainerStarted","Data":"67eca637a59f7dc3c956b868e303d5e1952345e7f01979da83ffa7560842673b"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.880572 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.883996 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr" event={"ID":"db706ef4-d8fb-438b-96e5-7fef497272a0","Type":"ContainerStarted","Data":"537e6da1c5c5f9dc2ff501e34a0e3c0b23a82330bf96ac4f550465aefa8ee03e"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.884046 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr" event={"ID":"db706ef4-d8fb-438b-96e5-7fef497272a0","Type":"ContainerStarted","Data":"19afe62f483c18da9f52b6521d33ad2a60ea9819e1cfc822e87212cabc7cfbb1"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.892611 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6" event={"ID":"0e1dec7f-15b7-44fd-8905-9084995950c2","Type":"ContainerStarted","Data":"8b2dafa0fc40c63f2cfe8d1b50ee61079e0693a0c4cae716936438d3b48499f8"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.904472 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:02 crc kubenswrapper[4763]: E0930 13:38:02.905193 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:03.405173407 +0000 UTC m=+155.543733692 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.906297 4763 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-g4x6h container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.906333 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" podUID="dfb46cdc-6bb1-48a9-a80f-3330b65f96cf" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.911895 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" event={"ID":"d11299ad-6339-4844-8031-d517a5535b1b","Type":"ContainerStarted","Data":"876fb8f6a2644bb27d63cd5c4f3dc0a5ac986efeb2c0bd06f00a483d59f043b1"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.925123 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" podStartSLOduration=128.925103429 podStartE2EDuration="2m8.925103429s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.885359197 +0000 UTC m=+155.023919482" watchObservedRunningTime="2025-09-30 13:38:02.925103429 +0000 UTC m=+155.063663714" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.959646 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" event={"ID":"6a540445-8589-4437-b134-38ba9d38faf0","Type":"ContainerStarted","Data":"6090fa3f0e0b9e778f501308fffb3585a5c9d2c34daf2cae207338f9df4bf39b"} Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.970719 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" podStartSLOduration=128.970701398 podStartE2EDuration="2m8.970701398s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.944128431 +0000 UTC m=+155.082688716" watchObservedRunningTime="2025-09-30 13:38:02.970701398 +0000 UTC m=+155.109261683" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.985947 4763 patch_prober.go:28] interesting pod/router-default-5444994796-gdlbd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:38:02 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Sep 30 13:38:02 crc kubenswrapper[4763]: [+]process-running ok Sep 30 13:38:02 crc 
kubenswrapper[4763]: healthz check failed Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.986017 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gdlbd" podUID="d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:38:02 crc kubenswrapper[4763]: I0930 13:38:02.986227 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" event={"ID":"66560dbc-ad52-44b6-ad49-a8c83e403714","Type":"ContainerStarted","Data":"9ed2f311c5fda6935146650e7bd36b67a1808e3ff78e4bb920b352ee10866afc"} Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.006333 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.007999 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:03.507980313 +0000 UTC m=+155.646540598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.029774 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq" event={"ID":"589a627b-9da3-4db1-80b4-93c2e444bd17","Type":"ContainerStarted","Data":"2480645e4fdd4bd2065d8b223074bc685aaf778f0b4f47ff0e36d5ce96d8d561"} Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.031227 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq" Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.043489 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj" event={"ID":"b20ac013-c0be-4b7a-b5a8-cd6db89814ee","Type":"ContainerStarted","Data":"670175cc43b0e41b38507db650a1c9b524f8c01cd3e7c6ec7b89814e1e3ff530"} Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.074947 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bdwgk" event={"ID":"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9","Type":"ContainerStarted","Data":"6281a0fb443513c9920cf02d48dfd684460e7beba7b7f4396052e44f7b81c0fe"} Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.074994 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bdwgk" event={"ID":"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9","Type":"ContainerStarted","Data":"5f2bf9d0f521d70e131fac5e09b5a8c3fb2d44e8f8e20970f01139ae38214294"} Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.076266 4763 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kq9tr" podStartSLOduration=129.076247148 podStartE2EDuration="2m9.076247148s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:02.970444181 +0000 UTC m=+155.109004466" watchObservedRunningTime="2025-09-30 13:38:03.076247148 +0000 UTC m=+155.214807433" Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.077845 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq" podStartSLOduration=129.077838405 podStartE2EDuration="2m9.077838405s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:03.075952362 +0000 UTC m=+155.214512647" watchObservedRunningTime="2025-09-30 13:38:03.077838405 +0000 UTC m=+155.216398690" Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.081009 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" event={"ID":"ee65800a-94a4-43c1-a5bf-3a7b889619d8","Type":"ContainerStarted","Data":"35ea2e7c29bb4c2fc84b5e86900e090aa438541be624eebde6b18293eb2248f0"} Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.085767 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-59chv" event={"ID":"97702a69-e0ad-47b9-b8b7-d32fadc9185e","Type":"ContainerStarted","Data":"0f72c150ee7ae95751d3385a27303c2c6c826f113b96bb2432d652818c2c1f73"} Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.089212 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-f228d container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.089246 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f228d" podUID="eb3a4cfd-1db3-488c-ba4c-bd04add6bd05" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.100268 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m74w9" Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.136576 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.137251 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:03.637238254 +0000 UTC m=+155.775798539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.238914 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.244986 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:03.744956495 +0000 UTC m=+155.883516790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.341919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.342237 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:03.842224884 +0000 UTC m=+155.980785169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.443786 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.444432 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:03.944411517 +0000 UTC m=+156.082971802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.491109 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.491170 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.500591 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.546242 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.546721 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.046695461 +0000 UTC m=+156.185255746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.647650 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.648302 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.148275639 +0000 UTC m=+156.286835924 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.648429 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.648731 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.14872203 +0000 UTC m=+156.287282315 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.749653 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.749899 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.249868188 +0000 UTC m=+156.388428473 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.750163 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.750534 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.250513853 +0000 UTC m=+156.389074138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.851276 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.851885 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.351866627 +0000 UTC m=+156.490426912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.953766 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:03 crc kubenswrapper[4763]: E0930 13:38:03.954224 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.454208173 +0000 UTC m=+156.592768458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.970551 4763 patch_prober.go:28] interesting pod/router-default-5444994796-gdlbd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:38:03 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Sep 30 13:38:03 crc kubenswrapper[4763]: [+]process-running ok Sep 30 13:38:03 crc kubenswrapper[4763]: healthz check failed Sep 30 13:38:03 crc kubenswrapper[4763]: I0930 13:38:03.970657 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gdlbd" podUID="d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.054994 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.055176 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.555140976 +0000 UTC m=+156.693701281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.055608 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.055956 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.555945085 +0000 UTC m=+156.694505370 (durationBeforeRetry 500ms). 
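The router-default startup probe above fails with HTTP 500 and a body listing per-check results ([-]backend-http and [-]has-synced failed, [+]process-running ok, then "healthz check failed"). That body is the standard Kubernetes aggregated-healthz format; a handler like the following sketch produces it. The check names and failure reasons here are illustrative, not the router's real checks:

```go
// Minimal sketch of an aggregated healthz endpoint producing the
// "[+]check ok / [-]check failed" body seen in the probe output above.
// Shaped after the k8s.io/apiserver healthz format; illustrative only.
package main

import (
	"fmt"
	"net/http"
)

type check struct {
	name string
	fn   func() error
}

func healthz(checks []check) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		failed := false
		body := ""
		for _, c := range checks {
			if err := c.fn(); err != nil {
				failed = true
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
			} else {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			}
		}
		if failed {
			w.WriteHeader(http.StatusInternalServerError) // probe sees 500
			body += "healthz check failed\n"
		} else {
			body += "healthz check passed\n"
		}
		fmt.Fprint(w, body)
	}
}

func main() {
	checks := []check{
		{"backend-http", func() error { return fmt.Errorf("not ready") }},
		{"has-synced", func() error { return fmt.Errorf("not synced") }},
		{"process-running", func() error { return nil }},
	}
	http.Handle("/healthz", healthz(checks))
	_ = http.ListenAndServe(":8080", nil)
}
```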
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.100264 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj" event={"ID":"b20ac013-c0be-4b7a-b5a8-cd6db89814ee","Type":"ContainerStarted","Data":"93a46f4394232b5247f88bbebc6be8fae85381ca753f3a81e172ae2ecd63934b"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.106300 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bdwgk" event={"ID":"e0472dbb-f3c8-4830-a0fb-0e2a4a23e5f9","Type":"ContainerStarted","Data":"e4757de408c64fe9835468858c8e462d01a77e59264f52dc0e66372fa116056f"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.106784 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-bdwgk" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.110359 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-59chv" event={"ID":"97702a69-e0ad-47b9-b8b7-d32fadc9185e","Type":"ContainerStarted","Data":"e230f22bfb9ff6805b4f2e4225cea7e8de5cf9072e09552adca7b304c83962e8"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.111955 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" event={"ID":"85d6ec09-077d-4958-b8c7-d09dd9c45e29","Type":"ContainerStarted","Data":"9af66d34d40f74d1f3a1c044816f908757701c225819a38e4ffc99c3e7ea5c2b"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.113359 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-55k7x" event={"ID":"e350689d-81e5-4fc9-a346-b57a553f39fd","Type":"ContainerStarted","Data":"5bedddb5ac562f6c49facc4b8ef7643c4704addd5eba8b3d8476802c87c07f58"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.115020 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" event={"ID":"d11299ad-6339-4844-8031-d517a5535b1b","Type":"ContainerStarted","Data":"471a942600b5a11e80a47cc295493af8d867f9f4a3924f0cb4318bbae9572d7a"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.130068 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" event={"ID":"66560dbc-ad52-44b6-ad49-a8c83e403714","Type":"ContainerStarted","Data":"5a152d433c28089bb54667af724f6bb38d2561718aa31d3fe173a3abb342fc95"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.144290 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" event={"ID":"8920a637-88bb-4ecd-b699-7f66dd955746","Type":"ContainerStarted","Data":"94ad6bc3a366bdb525f82954c3c53cb031608d350cb904025e249e3c378780fa"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.144341 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" 
event={"ID":"8920a637-88bb-4ecd-b699-7f66dd955746","Type":"ContainerStarted","Data":"d1c9e0639edf5ae2d3d55f9313fe3758d1378c4bb61c19d29211fb013e25e769"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.163223 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.163914 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.663890601 +0000 UTC m=+156.802450886 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.165526 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ms8hj" podStartSLOduration=130.165512669 podStartE2EDuration="2m10.165512669s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.127399374 +0000 UTC m=+156.265959659" watchObservedRunningTime="2025-09-30 13:38:04.165512669 +0000 UTC m=+156.304072954" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.166654 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bdwgk" podStartSLOduration=7.166644465 podStartE2EDuration="7.166644465s" podCreationTimestamp="2025-09-30 13:37:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.165064769 +0000 UTC m=+156.303625054" watchObservedRunningTime="2025-09-30 13:38:04.166644465 +0000 UTC m=+156.305204780" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.167268 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6" event={"ID":"0e1dec7f-15b7-44fd-8905-9084995950c2","Type":"ContainerStarted","Data":"b5fae7358c00ce3efb05932975f36b7bca6bfba3c856eabfd3035bda6de0b8b2"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.167378 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6" event={"ID":"0e1dec7f-15b7-44fd-8905-9084995950c2","Type":"ContainerStarted","Data":"9eb8dd44f9ecedff553e51d0ff7eb3e42bbe4a5ec719006a7467ea57bfc2c513"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.193221 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-7czqk" podStartSLOduration=130.193200182 podStartE2EDuration="2m10.193200182s" podCreationTimestamp="2025-09-30 13:35:54 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.190214043 +0000 UTC m=+156.328774328" watchObservedRunningTime="2025-09-30 13:38:04.193200182 +0000 UTC m=+156.331760467" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.196313 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m47xm" event={"ID":"318791e7-bdca-4695-9910-b9162ff85baf","Type":"ContainerStarted","Data":"8dc9ce779f2f98d8ad85b94c06db6470b93a870a12faa56cdb9c8bfbde568ef3"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.196366 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m47xm" event={"ID":"318791e7-bdca-4695-9910-b9162ff85baf","Type":"ContainerStarted","Data":"d5b214314590d420650196f62d690a1e161a0359b187136f4fd150006e2dc0a6"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.211079 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xvk2v" podStartSLOduration=130.211056556 podStartE2EDuration="2m10.211056556s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.210087384 +0000 UTC m=+156.348647669" watchObservedRunningTime="2025-09-30 13:38:04.211056556 +0000 UTC m=+156.349616841" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.211291 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" event={"ID":"63d0c459-8206-47bc-991f-2c3a1ed20a4f","Type":"ContainerStarted","Data":"66b938a441c84393d5f41323e380181e137a8d5faf6c46b59a23d47ef9e1cecb"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.220756 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" event={"ID":"ee65800a-94a4-43c1-a5bf-3a7b889619d8","Type":"ContainerStarted","Data":"cedf9cf24a9bb8650eb5c79d59377e37ec37a6304ef3b99088ef345fa819a4e0"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.220803 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" event={"ID":"ee65800a-94a4-43c1-a5bf-3a7b889619d8","Type":"ContainerStarted","Data":"0eb53dce5b30db8ab1b992ac00d235b835e6211cb8cf1453acf823ac3687520c"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.233676 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-55k7x" podStartSLOduration=8.233644871 podStartE2EDuration="8.233644871s" podCreationTimestamp="2025-09-30 13:37:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.230748493 +0000 UTC m=+156.369308768" watchObservedRunningTime="2025-09-30 13:38:04.233644871 +0000 UTC m=+156.372205156" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.247928 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd" event={"ID":"7c634595-a271-4ae0-8477-39ef345aa87b","Type":"ContainerStarted","Data":"0c10a420b43682e4024c2e9d6726b6f4a1cd274c8e1172e3e7854a17941ec542"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 
13:38:04.247980 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd" event={"ID":"7c634595-a271-4ae0-8477-39ef345aa87b","Type":"ContainerStarted","Data":"daa20f38d6e14491f925a0053b36e56b8d6e3caf6736605ee24da5edf09a6dbb"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.254113 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t" event={"ID":"a660fb91-7237-4d7a-9c8b-643f63bd589f","Type":"ContainerStarted","Data":"b6859022b0998da485536d157a10628226b5bec08e220143a58b1ada7774cd54"} Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.257096 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7jfqq" podStartSLOduration=130.257083515 podStartE2EDuration="2m10.257083515s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.255973529 +0000 UTC m=+156.394533814" watchObservedRunningTime="2025-09-30 13:38:04.257083515 +0000 UTC m=+156.395643800" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.268322 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.270299 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.770271641 +0000 UTC m=+156.908831926 (durationBeforeRetry 500ms). 
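The pod_startup_latency_tracker records above report two durations per pod. From the fields logged (podCreationTimestamp, firstStartedPulling, lastFinishedPulling, observedRunningTime), the SLO duration is the end-to-end startup time minus the image-pull window; with zero-valued pull timestamps, as in these records, the two durations coincide. A simplified sketch of that arithmetic (the timestamps below are illustrative, and kubelet's real logic lives in pod_startup_latency_tracker.go):

```go
// Sketch of how podStartSLOduration relates to podStartE2EDuration in
// the latency-tracker records above. When first/lastFinishedPulling are
// the zero time (images already present on the node), the SLO duration
// equals the end-to-end duration. Simplified illustration.
package main

import (
	"fmt"
	"time"
)

func startupDurations(created, firstPull, lastPull, running time.Time) (slo, e2e time.Duration) {
	e2e = running.Sub(created)
	slo = e2e
	if !firstPull.IsZero() && !lastPull.IsZero() {
		slo -= lastPull.Sub(firstPull) // exclude image pull time from the SLO
	}
	return slo, e2e
}

func main() {
	created := time.Date(2025, 9, 30, 13, 35, 54, 0, time.UTC)
	running := time.Date(2025, 9, 30, 13, 38, 4, 127399374, time.UTC)
	slo, e2e := startupDurations(created, time.Time{}, time.Time{}, running)
	fmt.Println(slo, e2e) // both 2m10.127399374s: no pull window to subtract
}
```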
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.282046 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" event={"ID":"eb5fc3ab-ae64-4fea-ba4f-010a25c6791c","Type":"ContainerStarted","Data":"ae50b3e7dfcf02ed783513575d540ae8c7c0fc4eb185797849c07b8ae16f5bdf"}
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.283236 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9"
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.284832 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndxvt" podStartSLOduration=130.284817009 podStartE2EDuration="2m10.284817009s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.282622108 +0000 UTC m=+156.421182393" watchObservedRunningTime="2025-09-30 13:38:04.284817009 +0000 UTC m=+156.423377294"
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.301714 4763 patch_prober.go:28] interesting pod/apiserver-76f77b778f-6ts49 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Sep 30 13:38:04 crc kubenswrapper[4763]: [+]log ok
Sep 30 13:38:04 crc kubenswrapper[4763]: [+]etcd ok
Sep 30 13:38:04 crc kubenswrapper[4763]: [+]poststarthook/start-apiserver-admission-initializer ok
Sep 30 13:38:04 crc kubenswrapper[4763]: [+]poststarthook/generic-apiserver-start-informers ok
Sep 30 13:38:04 crc kubenswrapper[4763]: [+]poststarthook/max-in-flight-filter ok
Sep 30 13:38:04 crc kubenswrapper[4763]: [+]poststarthook/storage-object-count-tracker-hook ok
Sep 30 13:38:04 crc kubenswrapper[4763]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Sep 30 13:38:04 crc kubenswrapper[4763]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Sep 30 13:38:04 crc kubenswrapper[4763]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Sep 30 13:38:04 crc kubenswrapper[4763]: [+]poststarthook/project.openshift.io-projectcache ok
Sep 30 13:38:04 crc kubenswrapper[4763]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Sep 30 13:38:04 crc kubenswrapper[4763]: [-]poststarthook/openshift.io-startinformers failed: reason withheld
Sep 30 13:38:04 crc kubenswrapper[4763]: [+]poststarthook/openshift.io-restmapperupdater ok
Sep 30 13:38:04 crc kubenswrapper[4763]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Sep 30 13:38:04 crc kubenswrapper[4763]: livez check failed
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.301780 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-6ts49" podUID="f2c347bc-ec1b-4ead-b9c8-f8a3443c2322" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.312417 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" event={"ID":"6a540445-8589-4437-b134-38ba9d38faf0","Type":"ContainerStarted","Data":"9ead3cce3b8bef2eff1fd9882d990535e94a6bcd8889b85de032de55a378c287"}
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.313254 4763 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xbmq9 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body=
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.313396 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" podUID="eb5fc3ab-ae64-4fea-ba4f-010a25c6791c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused"
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.331117 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6k46j" podStartSLOduration=130.331099233 podStartE2EDuration="2m10.331099233s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.330223254 +0000 UTC m=+156.468783539" watchObservedRunningTime="2025-09-30 13:38:04.331099233 +0000 UTC m=+156.469659518"
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.342955 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p" event={"ID":"a2444e42-08b2-4e35-ae89-e2666a8fd3b6","Type":"ContainerStarted","Data":"b7f83ffbdd2ffec14941fac427d88289ba72a375671b683c343a954b6e525eba"}
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.342998 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p"
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.343009 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p" event={"ID":"a2444e42-08b2-4e35-ae89-e2666a8fd3b6","Type":"ContainerStarted","Data":"0266c81054ca8907085a01c97942b4bc8242c313c17eae8b2bd3fd6287234cca"}
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.353399 4763 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dqjfv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body=
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.353456 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" podUID="2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused"
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.355739 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-m47xm" podStartSLOduration=130.355709655 podStartE2EDuration="2m10.355709655s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.353664287 +0000 UTC m=+156.492224572" watchObservedRunningTime="2025-09-30 13:38:04.355709655 +0000 UTC m=+156.494269940"
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.372324 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.372740 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.87272121 +0000 UTC m=+157.011281515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.374225 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.379674 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.879662231 +0000 UTC m=+157.018222516 (durationBeforeRetry 500ms).
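Two flavors of probe failure appear in the records above: HTTP 500 from an endpoint that is up but not yet healthy (the apiserver livez output), and "connect: connection refused" when nothing is listening on the port yet (catalog-operator, marketplace-operator). A sketch of what an HTTP probe does; kubelet's real prober lives in pkg/probe/http, and the 2xx/3xx success range below matches its behavior:

```go
// Sketch of an HTTP probe like the readiness checks above: GET the
// endpoint with a short timeout; a refused connection and a non-2xx/3xx
// status are both failures, but with different outputs. Illustrative.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func probeHTTP(url string) (success bool, output string) {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. "dial tcp 10.217.0.34:8080: connect: connection refused"
		return false, fmt.Sprintf("Get %q: %v", url, err)
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return true, ""
	}
	return false, fmt.Sprintf("HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	ok, out := probeHTTP("http://10.217.0.34:8080/healthz")
	fmt.Println(ok, out)
}
```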
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.386730 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lnk86" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.397994 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g4x6h" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.399043 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-88dg6" podStartSLOduration=130.399031721 podStartE2EDuration="2m10.399031721s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.398359245 +0000 UTC m=+156.536919530" watchObservedRunningTime="2025-09-30 13:38:04.399031721 +0000 UTC m=+156.537592006" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.449450 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ldk4q" podStartSLOduration=130.449430151 podStartE2EDuration="2m10.449430151s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.449172864 +0000 UTC m=+156.587733149" watchObservedRunningTime="2025-09-30 13:38:04.449430151 +0000 UTC m=+156.587990436" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.480665 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.482267 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:04.982224482 +0000 UTC m=+157.120784767 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.487260 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zt76t" podStartSLOduration=130.487243329 podStartE2EDuration="2m10.487243329s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.487167567 +0000 UTC m=+156.625727862" watchObservedRunningTime="2025-09-30 13:38:04.487243329 +0000 UTC m=+156.625803614" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.560866 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9" podStartSLOduration=130.560846677 podStartE2EDuration="2m10.560846677s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.559953177 +0000 UTC m=+156.698513462" watchObservedRunningTime="2025-09-30 13:38:04.560846677 +0000 UTC m=+156.699406962" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.582179 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.582944 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.08292695 +0000 UTC m=+157.221487235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.591620 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-w2tnd" podStartSLOduration=130.591583471 podStartE2EDuration="2m10.591583471s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.591203483 +0000 UTC m=+156.729763778" watchObservedRunningTime="2025-09-30 13:38:04.591583471 +0000 UTC m=+156.730143756" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.627327 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" podStartSLOduration=130.627284161 podStartE2EDuration="2m10.627284161s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.624948816 +0000 UTC m=+156.763509101" watchObservedRunningTime="2025-09-30 13:38:04.627284161 +0000 UTC m=+156.765844446" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.654505 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p" podStartSLOduration=130.654478782 podStartE2EDuration="2m10.654478782s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:04.651490382 +0000 UTC m=+156.790050697" watchObservedRunningTime="2025-09-30 13:38:04.654478782 +0000 UTC m=+156.793039067" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.683707 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.684179 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.18415793 +0000 UTC m=+157.322718215 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.785193 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.785796 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.28577816 +0000 UTC m=+157.424338515 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.886507 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.886710 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.386681593 +0000 UTC m=+157.525241878 (durationBeforeRetry 500ms). 
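The same PVC keeps appearing in alternating UnmountVolume records (for the deleted pod known only by UID 8f668bae-612b-4b75-9490-919e737c6a3b) and MountVolume records (for the new image-registry-697d97f7c8-jmpjx pod) because the volume manager reconciles desired state against actual state on every pass and re-issues whichever operations remain unsatisfied. A simplified illustration of that loop, not kubelet's reconciler code:

```go
// Sketch of the reconciliation pattern behind the alternating
// "UnmountVolume started" / "MountVolume started" pairs: each pass
// compares desired state (volume mounted for the new pod) with actual
// state (volume still mounted for the deleted pod) and issues both
// operations; both keep failing until the CSI driver registers.
package main

import (
	"errors"
	"fmt"
	"time"
)

var errDriverNotFound = errors.New(
	"driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")

type state struct {
	mountedFor map[string]string // volume -> pod UID (actual state)
	desiredFor map[string]string // volume -> pod UID (desired state)
	registered bool              // has the CSI driver registered yet?
}

func (s *state) unmount(vol string) error {
	if !s.registered {
		return errDriverNotFound
	}
	delete(s.mountedFor, vol)
	return nil
}

func (s *state) mount(vol, pod string) error {
	if !s.registered {
		return errDriverNotFound
	}
	s.mountedFor[vol] = pod
	return nil
}

func (s *state) reconcile() {
	for vol, pod := range s.mountedFor {
		if s.desiredFor[vol] != pod { // stale mount for a deleted pod
			fmt.Printf("UnmountVolume started for %s: %v\n", vol, s.unmount(vol))
		}
	}
	for vol, pod := range s.desiredFor {
		if s.mountedFor[vol] != pod { // desired mount not realized yet
			fmt.Printf("MountVolume started for %s: %v\n", vol, s.mount(vol, pod))
		}
	}
}

func main() {
	s := &state{
		mountedFor: map[string]string{"pvc-657094db": "8f668bae"},
		desiredFor: map[string]string{"pvc-657094db": "5b970ab9"},
	}
	s.reconcile()                      // both operations fail: driver not registered
	time.Sleep(500 * time.Millisecond) // stand-in for durationBeforeRetry
	s.registered = true
	s.reconcile() // unmount of the stale pod succeeds, then the mount
}
```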
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.886878 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.887346 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.387338098 +0000 UTC m=+157.525898383 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.909754 4763 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.971202 4763 patch_prober.go:28] interesting pod/router-default-5444994796-gdlbd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:38:04 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Sep 30 13:38:04 crc kubenswrapper[4763]: [+]process-running ok Sep 30 13:38:04 crc kubenswrapper[4763]: healthz check failed Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.971274 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gdlbd" podUID="d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.988840 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.988989 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 13:38:05.488969818 +0000 UTC m=+157.627530113 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:04 crc kubenswrapper[4763]: I0930 13:38:04.989266 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:04 crc kubenswrapper[4763]: E0930 13:38:04.989639 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.489628343 +0000 UTC m=+157.628188628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.089937 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:05 crc kubenswrapper[4763]: E0930 13:38:05.090133 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.590106576 +0000 UTC m=+157.728666861 (durationBeforeRetry 500ms). 
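The plugin_watcher.go:194 record just above ("Adding socket path or updating timestamp to desired state cache" for kubevirt.io.hostpath-provisioner-reg.sock) is the discovery half of CSI registration: the kubelet watches /var/lib/kubelet/plugins_registry for new plugin sockets and queues them for registration. A sketch of that watch, assuming fsnotify for the file-watch mechanics:

```go
// Sketch of the discovery step logged at plugin_watcher.go:194: watch
// the plugins_registry directory for newly created unix sockets and
// record them for registration. fsnotify is an assumption here; the
// registration handshake itself is out of scope. Illustrative only.
package main

import (
	"log"
	"path/filepath"
	"strings"

	"github.com/fsnotify/fsnotify"
)

func main() {
	w, err := fsnotify.NewWatcher()
	if err != nil {
		log.Fatal(err)
	}
	defer w.Close()

	dir := "/var/lib/kubelet/plugins_registry"
	if err := w.Add(dir); err != nil {
		log.Fatal(err)
	}
	for ev := range w.Events {
		if ev.Op&fsnotify.Create != 0 && strings.HasSuffix(ev.Name, ".sock") {
			log.Printf("Adding socket path %s to desired state cache",
				filepath.Clean(ev.Name))
			// a gRPC registration handshake over this socket would follow
		}
	}
}
```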
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.090310 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:05 crc kubenswrapper[4763]: E0930 13:38:05.090670 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.590661949 +0000 UTC m=+157.729222234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.191128 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:05 crc kubenswrapper[4763]: E0930 13:38:05.191345 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.691310146 +0000 UTC m=+157.829870431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.191758 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:05 crc kubenswrapper[4763]: E0930 13:38:05.192183 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.692169946 +0000 UTC m=+157.830730351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.267993 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8gtwq" Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.292411 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:38:05 crc kubenswrapper[4763]: E0930 13:38:05.292680 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.792654179 +0000 UTC m=+157.931214514 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.351608 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-59chv" event={"ID":"97702a69-e0ad-47b9-b8b7-d32fadc9185e","Type":"ContainerStarted","Data":"a42e17f8dfeed4b3d6cf5bb9fa0f8dc5459444d100082b6cff1cafc76f3de2c4"}
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.351667 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-59chv" event={"ID":"97702a69-e0ad-47b9-b8b7-d32fadc9185e","Type":"ContainerStarted","Data":"d828e1c797ccf66ddb873f39d6d8f2ec15ae2d2b79a45b6b3268f29cc321351a"}
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.360084 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv"
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.360584 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xbmq9"
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.394546 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:38:05 crc kubenswrapper[4763]: E0930 13:38:05.395065 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.895050016 +0000 UTC m=+158.033610311 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.415626 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-59chv" podStartSLOduration=9.415581533 podStartE2EDuration="9.415581533s" podCreationTimestamp="2025-09-30 13:37:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:05.384591704 +0000 UTC m=+157.523151989" watchObservedRunningTime="2025-09-30 13:38:05.415581533 +0000 UTC m=+157.554141818"
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.496679 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:38:05 crc kubenswrapper[4763]: E0930 13:38:05.497127 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:38:05.997112596 +0000 UTC m=+158.135672871 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.598946 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:38:05 crc kubenswrapper[4763]: E0930 13:38:05.599321 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:38:06.099310599 +0000 UTC m=+158.237870884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jmpjx" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.636424 4763 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T13:38:04.909790609Z","Handler":null,"Name":""}
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.680205 4763 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.680249 4763 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.701095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.798204 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.802794 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.826097 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
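The failures above are a startup ordering race, not data loss: the volume manager keeps retrying on a 500ms backoff ("No retries permitted until ... durationBeforeRetry 500ms") until the hostpath plugin registers over its plugins_registry socket at 13:38:05.636-680, after which the pending TearDown and MountDevice operations succeed. The API-visible mirror of kubelet's registered-driver list is the node's CSINode object, so a quick way to confirm the race has resolved is to list its drivers. A minimal diagnostic sketch, assuming client-go is available and a kubeconfig is reachable via the KUBECONFIG environment variable; the node name "crc" and the driver name come from this log:

    // csicheck.go - print the CSI drivers registered on node "crc".
    // Sketch only: error handling is reduced to panics for brevity.
    package main

    import (
        "context"
        "fmt"
        "os"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // CSINode reflects kubelet's in-memory plugin registry; once
        // kubevirt.io.hostpath-provisioner appears here, the
        // "not found in the list of registered CSI drivers" errors stop.
        csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        for _, d := range csiNode.Spec.Drivers {
            fmt.Printf("registered driver: %s (node ID %s)\n", d.Name, d.NodeID)
        }
    }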
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.826138 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.857862 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jmpjx\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.966190 4763 patch_prober.go:28] interesting pod/router-default-5444994796-gdlbd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 13:38:05 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld
Sep 30 13:38:05 crc kubenswrapper[4763]: [+]process-running ok
Sep 30 13:38:05 crc kubenswrapper[4763]: healthz check failed
Sep 30 13:38:05 crc kubenswrapper[4763]: I0930 13:38:05.966282 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gdlbd" podUID="d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.059843 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.059901 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.086230 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6svmn"]
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.087120 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.089307 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.101144 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6svmn"]
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.106434 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khsj2\" (UniqueName: \"kubernetes.io/projected/783a0279-c32d-4085-9274-6291b7544803-kube-api-access-khsj2\") pod \"community-operators-6svmn\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.106547 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-utilities\") pod \"community-operators-6svmn\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.106566 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-catalog-content\") pod \"community-operators-6svmn\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.159539 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
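The router's startup probe output above is the aggregated health-check format: one [+]/[-] line per registered sub-check, an overall "healthz check failed" trailer, and HTTP 500 until every check passes (here backend-http and has-synced are still failing while process-running is ok). A minimal sketch of a handler that produces output of this shape; the check names are taken from the log, but the wiring is an assumption modeled on the healthz conventions used across Kubernetes components, not the router's actual implementation:

    // healthz.go - reproduce the aggregated [+]/[-] probe output format.
    package main

    import (
        "fmt"
        "net/http"
    )

    type check struct {
        name string
        fn   func() error
    }

    func healthz(checks []check) http.HandlerFunc {
        return func(w http.ResponseWriter, r *http.Request) {
            body, failed := "", false
            for _, c := range checks {
                if err := c.fn(); err != nil {
                    failed = true
                    body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
                } else {
                    body += fmt.Sprintf("[+]%s ok\n", c.name)
                }
            }
            if failed {
                // kubelet's prober logs this as "HTTP probe failed with statuscode: 500".
                w.WriteHeader(http.StatusInternalServerError)
                body += "healthz check failed\n"
            }
            fmt.Fprint(w, body)
        }
    }

    func main() {
        checks := []check{
            {"backend-http", func() error { return fmt.Errorf("not ready") }},
            {"has-synced", func() error { return fmt.Errorf("not ready") }},
            {"process-running", func() error { return nil }},
        }
        http.HandleFunc("/healthz", healthz(checks))
        http.ListenAndServe(":1936", nil) // port is illustrative
    }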
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.207691 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-utilities\") pod \"community-operators-6svmn\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.208064 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-catalog-content\") pod \"community-operators-6svmn\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.208147 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khsj2\" (UniqueName: \"kubernetes.io/projected/783a0279-c32d-4085-9274-6291b7544803-kube-api-access-khsj2\") pod \"community-operators-6svmn\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.208510 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-utilities\") pod \"community-operators-6svmn\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.208580 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-catalog-content\") pod \"community-operators-6svmn\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.228314 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khsj2\" (UniqueName: \"kubernetes.io/projected/783a0279-c32d-4085-9274-6291b7544803-kube-api-access-khsj2\") pod \"community-operators-6svmn\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.292228 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j5ns2"]
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.293182 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.297518 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.310854 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-catalog-content\") pod \"certified-operators-j5ns2\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.310937 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4kvs\" (UniqueName: \"kubernetes.io/projected/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-kube-api-access-g4kvs\") pod \"certified-operators-j5ns2\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.310965 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-utilities\") pod \"certified-operators-j5ns2\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.312399 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j5ns2"]
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.360717 4763 generic.go:334] "Generic (PLEG): container finished" podID="6a540445-8589-4437-b134-38ba9d38faf0" containerID="9ead3cce3b8bef2eff1fd9882d990535e94a6bcd8889b85de032de55a378c287" exitCode=0
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.360764 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" event={"ID":"6a540445-8589-4437-b134-38ba9d38faf0","Type":"ContainerDied","Data":"9ead3cce3b8bef2eff1fd9882d990535e94a6bcd8889b85de032de55a378c287"}
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.362620 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jmpjx"]
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.403110 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.412160 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-catalog-content\") pod \"certified-operators-j5ns2\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.412317 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4kvs\" (UniqueName: \"kubernetes.io/projected/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-kube-api-access-g4kvs\") pod \"certified-operators-j5ns2\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.412374 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-utilities\") pod \"certified-operators-j5ns2\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.412972 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-catalog-content\") pod \"certified-operators-j5ns2\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.413552 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-utilities\") pod \"certified-operators-j5ns2\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.434725 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4kvs\" (UniqueName: \"kubernetes.io/projected/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-kube-api-access-g4kvs\") pod \"certified-operators-j5ns2\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.500562 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.501191 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-th5tc"]
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.502433 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-th5tc"
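The "Cleaned up orphaned pod volumes dir" entry at 13:38:06.500562 is kubelet's periodic housekeeping: once pod 8f668bae-612b-4b75-9490-919e737c6a3b was deleted and its CSI volume torn down (see the UnmountVolume.TearDown success above), the on-disk state under /var/lib/kubelet/pods/<uid> becomes removable. A read-only diagnostic sketch of the same directory an operator might inspect by hand, assuming it runs on the node with read access to /var/lib/kubelet/pods; it only prints and never deletes:

    // orphanscan.go - list pod UID directories kubelet still has on disk,
    // for comparison against the pods the API server currently knows about.
    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        entries, err := os.ReadDir("/var/lib/kubelet/pods")
        if err != nil {
            panic(err)
        }
        for _, e := range entries {
            if e.IsDir() {
                // Each directory is named after a pod UID, e.g.
                // 8f668bae-612b-4b75-9490-919e737c6a3b in the log above.
                fmt.Println(e.Name())
            }
        }
    }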
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.504834 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-th5tc"]
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.514754 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45rfn\" (UniqueName: \"kubernetes.io/projected/7aa0d48c-e676-4d10-bc96-616a4730715d-kube-api-access-45rfn\") pod \"community-operators-th5tc\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") " pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.514900 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-utilities\") pod \"community-operators-th5tc\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") " pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.514988 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-catalog-content\") pod \"community-operators-th5tc\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") " pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.615013 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.616233 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45rfn\" (UniqueName: \"kubernetes.io/projected/7aa0d48c-e676-4d10-bc96-616a4730715d-kube-api-access-45rfn\") pod \"community-operators-th5tc\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") " pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.616316 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-utilities\") pod \"community-operators-th5tc\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") " pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.616816 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-catalog-content\") pod \"community-operators-th5tc\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") " pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.617004 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-utilities\") pod \"community-operators-th5tc\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") " pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.617082 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-catalog-content\") pod \"community-operators-th5tc\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") " pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.638184 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45rfn\" (UniqueName: \"kubernetes.io/projected/7aa0d48c-e676-4d10-bc96-616a4730715d-kube-api-access-45rfn\") pod \"community-operators-th5tc\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") " pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.657088 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6svmn"]
Sep 30 13:38:06 crc kubenswrapper[4763]: W0930 13:38:06.669482 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783a0279_c32d_4085_9274_6291b7544803.slice/crio-26d1e419cbe94bb5a95e50e00f6b565a22bb8417090c864d662213fb39c5c6d1 WatchSource:0}: Error finding container 26d1e419cbe94bb5a95e50e00f6b565a22bb8417090c864d662213fb39c5c6d1: Status 404 returned error can't find the container with id 26d1e419cbe94bb5a95e50e00f6b565a22bb8417090c864d662213fb39c5c6d1
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.684543 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-snds8"]
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.685740 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.697464 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-snds8"]
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.718462 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-catalog-content\") pod \"certified-operators-snds8\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") " pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.718666 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-utilities\") pod \"certified-operators-snds8\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") " pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.718870 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pln2n\" (UniqueName: \"kubernetes.io/projected/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-kube-api-access-pln2n\") pod \"certified-operators-snds8\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") " pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.818473 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.819403 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-catalog-content\") pod \"certified-operators-snds8\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") " pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.819494 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-utilities\") pod \"certified-operators-snds8\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") " pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.819581 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pln2n\" (UniqueName: \"kubernetes.io/projected/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-kube-api-access-pln2n\") pod \"certified-operators-snds8\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") " pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.820655 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-utilities\") pod \"certified-operators-snds8\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") " pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.820808 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-catalog-content\") pod \"certified-operators-snds8\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") " pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.832752 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j5ns2"]
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.844574 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pln2n\" (UniqueName: \"kubernetes.io/projected/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-kube-api-access-pln2n\") pod \"certified-operators-snds8\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") " pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.965557 4763 patch_prober.go:28] interesting pod/router-default-5444994796-gdlbd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 13:38:06 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld
Sep 30 13:38:06 crc kubenswrapper[4763]: [+]process-running ok
Sep 30 13:38:06 crc kubenswrapper[4763]: healthz check failed
Sep 30 13:38:06 crc kubenswrapper[4763]: I0930 13:38:06.965632 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gdlbd" podUID="d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.010499 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.043161 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-th5tc"]
Sep 30 13:38:07 crc kubenswrapper[4763]: W0930 13:38:07.052684 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa0d48c_e676_4d10_bc96_616a4730715d.slice/crio-4453ced9808998240425dba0a30251806cc3b6e9bb77bae4f7a0bbdcafc2ca25 WatchSource:0}: Error finding container 4453ced9808998240425dba0a30251806cc3b6e9bb77bae4f7a0bbdcafc2ca25: Status 404 returned error can't find the container with id 4453ced9808998240425dba0a30251806cc3b6e9bb77bae4f7a0bbdcafc2ca25
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.207886 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-snds8"]
Sep 30 13:38:07 crc kubenswrapper[4763]: W0930 13:38:07.210928 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c12b612_76e8_4dc5_960e_8d6af4e5d1d6.slice/crio-0e73aa36ad0fb6019fb555e88f1d9bd2240cd0621fe44eb5e0fa88f308d11199 WatchSource:0}: Error finding container 0e73aa36ad0fb6019fb555e88f1d9bd2240cd0621fe44eb5e0fa88f308d11199: Status 404 returned error can't find the container with id 0e73aa36ad0fb6019fb555e88f1d9bd2240cd0621fe44eb5e0fa88f308d11199
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.368850 4763 generic.go:334] "Generic (PLEG): container finished" podID="7aa0d48c-e676-4d10-bc96-616a4730715d" containerID="6d0a1494b4a8dc25281fa509f6c87e06193a5e2f06544f75e64c3c174fae6c8b" exitCode=0
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.368966 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th5tc" event={"ID":"7aa0d48c-e676-4d10-bc96-616a4730715d","Type":"ContainerDied","Data":"6d0a1494b4a8dc25281fa509f6c87e06193a5e2f06544f75e64c3c174fae6c8b"}
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.368994 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th5tc" event={"ID":"7aa0d48c-e676-4d10-bc96-616a4730715d","Type":"ContainerStarted","Data":"4453ced9808998240425dba0a30251806cc3b6e9bb77bae4f7a0bbdcafc2ca25"}
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.370634 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" event={"ID":"5b970ab9-2ae4-48ea-a4a2-db0e890a156a","Type":"ContainerStarted","Data":"db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64"}
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.370696 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" event={"ID":"5b970ab9-2ae4-48ea-a4a2-db0e890a156a","Type":"ContainerStarted","Data":"0201642ab442770a1bbcdec4f0c1acc3aab597810d91687239492fb9cefa0344"}
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.370716 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx"
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.371249 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.372297 4763 generic.go:334] "Generic (PLEG): container finished" podID="783a0279-c32d-4085-9274-6291b7544803" containerID="1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c" exitCode=0
podID="783a0279-c32d-4085-9274-6291b7544803" containerID="1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c" exitCode=0 Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.372363 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6svmn" event={"ID":"783a0279-c32d-4085-9274-6291b7544803","Type":"ContainerDied","Data":"1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c"} Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.372392 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6svmn" event={"ID":"783a0279-c32d-4085-9274-6291b7544803","Type":"ContainerStarted","Data":"26d1e419cbe94bb5a95e50e00f6b565a22bb8417090c864d662213fb39c5c6d1"} Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.373945 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snds8" event={"ID":"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6","Type":"ContainerStarted","Data":"0e73aa36ad0fb6019fb555e88f1d9bd2240cd0621fe44eb5e0fa88f308d11199"} Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.376522 4763 generic.go:334] "Generic (PLEG): container finished" podID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" containerID="66a1792790848c0c6df748b1df400a8b74f33d2b2d293bf8b1c8adce639d75a1" exitCode=0 Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.376559 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j5ns2" event={"ID":"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb","Type":"ContainerDied","Data":"66a1792790848c0c6df748b1df400a8b74f33d2b2d293bf8b1c8adce639d75a1"} Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.376621 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j5ns2" event={"ID":"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb","Type":"ContainerStarted","Data":"9e94679ac3cced12a5a85d8138d3948709eb3aaa3ebb1ee2265ab8217cdf1c7a"} Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.569176 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.569977 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.570538 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" podStartSLOduration=133.570522366 podStartE2EDuration="2m13.570522366s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:07.565139831 +0000 UTC m=+159.703700116" watchObservedRunningTime="2025-09-30 13:38:07.570522366 +0000 UTC m=+159.709082651" Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.572377 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.573048 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.579372 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.696308 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.696429 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.707243 4763 patch_prober.go:28] interesting pod/console-f9d7485db-p5rvt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.707303 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-p5rvt" podUID="cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.731407 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15335c39-01fa-41e8-a0c5-8c59033f818b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15335c39-01fa-41e8-a0c5-8c59033f818b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.731614 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15335c39-01fa-41e8-a0c5-8c59033f818b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15335c39-01fa-41e8-a0c5-8c59033f818b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.736558 4763 util.go:48] "No ready sandbox for pod can be found. 
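Note the probe pattern across this stretch of the log: the router fails its startup probe once per second at 13:38:05-08 and yet is never killed, then flips to "started" further below at 13:38:09.968991. That is startup-probe semantics: kubelet tolerates up to failureThreshold consecutive misses at periodSeconds intervals before restarting the container, and liveness/readiness probing is gated until the startup probe passes. An illustrative construction of such a probe using the k8s.io/api/core/v1 types; the path, port, and thresholds are assumptions for the sketch, not the router Deployment's actual values:

    // probe.go - shape of a startup probe like the router's.
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        startup := &corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Path: "/healthz/ready",     // assumed path
                    Port: intstr.FromInt(1936), // assumed stats port
                },
            },
            PeriodSeconds:    1,   // matches the once-per-second failures logged above
            FailureThreshold: 120, // container is only restarted after this many misses
        }
        fmt.Printf("%+v\n", startup)
    }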
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.747028 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp"
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.768610 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.782859 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-6ts49"
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.833047 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a540445-8589-4437-b134-38ba9d38faf0-secret-volume\") pod \"6a540445-8589-4437-b134-38ba9d38faf0\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") "
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.833213 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tplb6\" (UniqueName: \"kubernetes.io/projected/6a540445-8589-4437-b134-38ba9d38faf0-kube-api-access-tplb6\") pod \"6a540445-8589-4437-b134-38ba9d38faf0\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") "
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.833248 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a540445-8589-4437-b134-38ba9d38faf0-config-volume\") pod \"6a540445-8589-4437-b134-38ba9d38faf0\" (UID: \"6a540445-8589-4437-b134-38ba9d38faf0\") "
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.833497 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15335c39-01fa-41e8-a0c5-8c59033f818b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15335c39-01fa-41e8-a0c5-8c59033f818b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.833700 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15335c39-01fa-41e8-a0c5-8c59033f818b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15335c39-01fa-41e8-a0c5-8c59033f818b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.833714 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15335c39-01fa-41e8-a0c5-8c59033f818b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15335c39-01fa-41e8-a0c5-8c59033f818b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.834557 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a540445-8589-4437-b134-38ba9d38faf0-config-volume" (OuterVolumeSpecName: "config-volume") pod "6a540445-8589-4437-b134-38ba9d38faf0" (UID: "6a540445-8589-4437-b134-38ba9d38faf0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.849934 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a540445-8589-4437-b134-38ba9d38faf0-kube-api-access-tplb6" (OuterVolumeSpecName: "kube-api-access-tplb6") pod "6a540445-8589-4437-b134-38ba9d38faf0" (UID: "6a540445-8589-4437-b134-38ba9d38faf0"). InnerVolumeSpecName "kube-api-access-tplb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.863320 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a540445-8589-4437-b134-38ba9d38faf0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6a540445-8589-4437-b134-38ba9d38faf0" (UID: "6a540445-8589-4437-b134-38ba9d38faf0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.884581 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15335c39-01fa-41e8-a0c5-8c59033f818b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15335c39-01fa-41e8-a0c5-8c59033f818b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.928455 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.935169 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tplb6\" (UniqueName: \"kubernetes.io/projected/6a540445-8589-4437-b134-38ba9d38faf0-kube-api-access-tplb6\") on node \"crc\" DevicePath \"\""
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.935210 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a540445-8589-4437-b134-38ba9d38faf0-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.935219 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a540445-8589-4437-b134-38ba9d38faf0-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.967322 4763 patch_prober.go:28] interesting pod/router-default-5444994796-gdlbd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 13:38:07 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld
Sep 30 13:38:07 crc kubenswrapper[4763]: [+]process-running ok
Sep 30 13:38:07 crc kubenswrapper[4763]: healthz check failed
Sep 30 13:38:07 crc kubenswrapper[4763]: I0930 13:38:07.967387 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gdlbd" podUID="d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.187045 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Sep 30 13:38:08 crc kubenswrapper[4763]: W0930 13:38:08.197427 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod15335c39_01fa_41e8_a0c5_8c59033f818b.slice/crio-a2f6af8565ebe07efd886c49002318e065bde33eaf24af22f950e80a5d416295 WatchSource:0}: Error finding container a2f6af8565ebe07efd886c49002318e065bde33eaf24af22f950e80a5d416295: Status 404 returned error can't find the container with id a2f6af8565ebe07efd886c49002318e065bde33eaf24af22f950e80a5d416295
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.284719 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7dh7w"]
Sep 30 13:38:08 crc kubenswrapper[4763]: E0930 13:38:08.284953 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a540445-8589-4437-b134-38ba9d38faf0" containerName="collect-profiles"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.284965 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a540445-8589-4437-b134-38ba9d38faf0" containerName="collect-profiles"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.285059 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a540445-8589-4437-b134-38ba9d38faf0" containerName="collect-profiles"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.285789 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.291483 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.296391 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dh7w"]
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.386573 4763 generic.go:334] "Generic (PLEG): container finished" podID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" containerID="1defee6b6c230b2f18c79cdbe3222ccef3d0c82add9a8322195f0f7c2f9c743b" exitCode=0
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.386686 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snds8" event={"ID":"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6","Type":"ContainerDied","Data":"1defee6b6c230b2f18c79cdbe3222ccef3d0c82add9a8322195f0f7c2f9c743b"}
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.388422 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"15335c39-01fa-41e8-a0c5-8c59033f818b","Type":"ContainerStarted","Data":"a2f6af8565ebe07efd886c49002318e065bde33eaf24af22f950e80a5d416295"}
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.393455 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.393988 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw" event={"ID":"6a540445-8589-4437-b134-38ba9d38faf0","Type":"ContainerDied","Data":"6090fa3f0e0b9e778f501308fffb3585a5c9d2c34daf2cae207338f9df4bf39b"}
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.394022 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6090fa3f0e0b9e778f501308fffb3585a5c9d2c34daf2cae207338f9df4bf39b"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.444087 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-catalog-content\") pod \"redhat-marketplace-7dh7w\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.444221 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-utilities\") pod \"redhat-marketplace-7dh7w\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.444289 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chk96\" (UniqueName: \"kubernetes.io/projected/3fe99667-cf30-4112-ae34-a5bdccbfc24f-kube-api-access-chk96\") pod \"redhat-marketplace-7dh7w\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.547025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-utilities\") pod \"redhat-marketplace-7dh7w\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.547173 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chk96\" (UniqueName: \"kubernetes.io/projected/3fe99667-cf30-4112-ae34-a5bdccbfc24f-kube-api-access-chk96\") pod \"redhat-marketplace-7dh7w\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.547203 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-catalog-content\") pod \"redhat-marketplace-7dh7w\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.548095 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-catalog-content\") pod \"redhat-marketplace-7dh7w\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.549572 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-utilities\") pod \"redhat-marketplace-7dh7w\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.580813 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chk96\" (UniqueName: \"kubernetes.io/projected/3fe99667-cf30-4112-ae34-a5bdccbfc24f-kube-api-access-chk96\") pod \"redhat-marketplace-7dh7w\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.619360 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.687456 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ztpwx"]
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.688873 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.725637 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztpwx"]
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.861992 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-catalog-content\") pod \"redhat-marketplace-ztpwx\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") " pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.862109 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-utilities\") pod \"redhat-marketplace-ztpwx\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") " pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.862129 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tpfq\" (UniqueName: \"kubernetes.io/projected/98083656-1422-49e1-a791-ed2ae6804df5-kube-api-access-5tpfq\") pod \"redhat-marketplace-ztpwx\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") " pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.909540 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dh7w"]
Sep 30 13:38:08 crc kubenswrapper[4763]: W0930 13:38:08.952510 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe99667_cf30_4112_ae34_a5bdccbfc24f.slice/crio-d872bec5fe4f7d1e14b93c270d7010ed18826c54d51d2ab96ee8d81540796e00 WatchSource:0}: Error finding container d872bec5fe4f7d1e14b93c270d7010ed18826c54d51d2ab96ee8d81540796e00: Status 404 returned error can't find the container with id d872bec5fe4f7d1e14b93c270d7010ed18826c54d51d2ab96ee8d81540796e00
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.963412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-catalog-content\") pod \"redhat-marketplace-ztpwx\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") " pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.963524 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-utilities\") pod \"redhat-marketplace-ztpwx\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") " pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.963550 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tpfq\" (UniqueName: \"kubernetes.io/projected/98083656-1422-49e1-a791-ed2ae6804df5-kube-api-access-5tpfq\") pod \"redhat-marketplace-ztpwx\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") " pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.965810 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-utilities\") pod \"redhat-marketplace-ztpwx\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") " pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.965847 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-catalog-content\") pod \"redhat-marketplace-ztpwx\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") " pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.966579 4763 patch_prober.go:28] interesting pod/router-default-5444994796-gdlbd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 13:38:08 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld
Sep 30 13:38:08 crc kubenswrapper[4763]: [+]process-running ok
Sep 30 13:38:08 crc kubenswrapper[4763]: healthz check failed
Sep 30 13:38:08 crc kubenswrapper[4763]: I0930 13:38:08.966671 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gdlbd" podUID="d97392e5-7108-4a1f-8b2d-e6c8f7bb42d3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.002969 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tpfq\" (UniqueName: \"kubernetes.io/projected/98083656-1422-49e1-a791-ed2ae6804df5-kube-api-access-5tpfq\") pod \"redhat-marketplace-ztpwx\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") " pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.030775 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztpwx"
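Nearly every pod in this log mounts a generated kube-api-access-* projected volume (kube-api-access-khsj2, -g4kvs, -45rfn, -chk96, -5tpfq above). These admission-injected volumes bundle a bound service-account token, the kube-root-ca.crt ConfigMap (whose reflector cache population also appears in this log), and a downward-API namespace file. A sketch of that volume's shape using the k8s.io/api/core/v1 types; the structure reflects what the injected volume contains, while the expiry value and item lists here are illustrative assumptions:

    // kubeapiaccess.go - shape of a generated "kube-api-access-*" projected volume.
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        expiry := int64(3607) // illustrative token lifetime in seconds
        vol := corev1.Volume{
            Name: "kube-api-access-khsj2", // name taken from the log above
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            ExpirationSeconds: &expiry,
                            Path:              "token",
                        }},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
                        }},
                        {DownwardAPI: &corev1.DownwardAPIProjection{
                            Items: []corev1.DownwardAPIVolumeFile{{
                                Path:     "namespace",
                                FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
                            }},
                        }},
                    },
                },
            },
        }
        fmt.Printf("%+v\n", vol)
    }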
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.278788 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-f228d container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.278833 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-f228d container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.278895 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f228d" podUID="eb3a4cfd-1db3-488c-ba4c-bd04add6bd05" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.278842 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-f228d" podUID="eb3a4cfd-1db3-488c-ba4c-bd04add6bd05" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.290107 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5qd4l"]
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.291882 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.294501 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.301928 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qd4l"]
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.405485 4763 generic.go:334] "Generic (PLEG): container finished" podID="15335c39-01fa-41e8-a0c5-8c59033f818b" containerID="926de14935c874131bfadfda325ab791dd6aa6a911bdc5e0a0e5b3c1d0e1d68d" exitCode=0
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.405877 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"15335c39-01fa-41e8-a0c5-8c59033f818b","Type":"ContainerDied","Data":"926de14935c874131bfadfda325ab791dd6aa6a911bdc5e0a0e5b3c1d0e1d68d"}
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.413244 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dh7w" event={"ID":"3fe99667-cf30-4112-ae34-a5bdccbfc24f","Type":"ContainerStarted","Data":"d872bec5fe4f7d1e14b93c270d7010ed18826c54d51d2ab96ee8d81540796e00"}
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.470789 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-utilities\") pod \"redhat-operators-5qd4l\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.470932 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr5zg\" (UniqueName: \"kubernetes.io/projected/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-kube-api-access-zr5zg\") pod \"redhat-operators-5qd4l\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.470989 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-catalog-content\") pod \"redhat-operators-5qd4l\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.572643 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr5zg\" (UniqueName: \"kubernetes.io/projected/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-kube-api-access-zr5zg\") pod \"redhat-operators-5qd4l\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.572777 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-catalog-content\") pod \"redhat-operators-5qd4l\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.572818 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-utilities\") pod \"redhat-operators-5qd4l\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.574518 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-catalog-content\") pod \"redhat-operators-5qd4l\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.575179 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-utilities\") pod \"redhat-operators-5qd4l\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.598227 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztpwx"]
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.600761 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr5zg\" (UniqueName: \"kubernetes.io/projected/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-kube-api-access-zr5zg\") pod \"redhat-operators-5qd4l\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:09 crc kubenswrapper[4763]: W0930 13:38:09.615389 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98083656_1422_49e1_a791_ed2ae6804df5.slice/crio-3699ef6c154e4ef447ba4ea0867327b1548718f7a824a09c6cac64b0cdb0ac16 WatchSource:0}: Error finding container 3699ef6c154e4ef447ba4ea0867327b1548718f7a824a09c6cac64b0cdb0ac16: Status 404 returned error can't find the container with id 3699ef6c154e4ef447ba4ea0867327b1548718f7a824a09c6cac64b0cdb0ac16
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.618532 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.688493 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bbzmc"]
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.689772 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.705931 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bbzmc"]
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.887468 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-utilities\") pod \"redhat-operators-bbzmc\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") " pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.887977 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-catalog-content\") pod \"redhat-operators-bbzmc\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") " pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.888113 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cv4d\" (UniqueName: \"kubernetes.io/projected/69483f45-9432-41f3-81e3-eb02b8b1fb00-kube-api-access-6cv4d\") pod \"redhat-operators-bbzmc\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") " pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.962973 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gdlbd"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.968991 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gdlbd"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.992302 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-utilities\") pod \"redhat-operators-bbzmc\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") " pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.992465 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-catalog-content\") pod \"redhat-operators-bbzmc\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") " pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:09 crc kubenswrapper[4763]:
I0930 13:38:09.992538 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cv4d\" (UniqueName: \"kubernetes.io/projected/69483f45-9432-41f3-81e3-eb02b8b1fb00-kube-api-access-6cv4d\") pod \"redhat-operators-bbzmc\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") " pod="openshift-marketplace/redhat-operators-bbzmc" Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.993409 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-utilities\") pod \"redhat-operators-bbzmc\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") " pod="openshift-marketplace/redhat-operators-bbzmc" Sep 30 13:38:09 crc kubenswrapper[4763]: I0930 13:38:09.993586 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-catalog-content\") pod \"redhat-operators-bbzmc\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") " pod="openshift-marketplace/redhat-operators-bbzmc" Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.055093 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cv4d\" (UniqueName: \"kubernetes.io/projected/69483f45-9432-41f3-81e3-eb02b8b1fb00-kube-api-access-6cv4d\") pod \"redhat-operators-bbzmc\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") " pod="openshift-marketplace/redhat-operators-bbzmc" Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.141330 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qd4l"] Sep 30 13:38:10 crc kubenswrapper[4763]: W0930 13:38:10.221770 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd23f5ffd_eee7_4ca7_b13f_c1175ab7eb9a.slice/crio-3fdf753f6ea5040413cbee858409cca2704f7d226f377e1996ed6858d9379567 WatchSource:0}: Error finding container 3fdf753f6ea5040413cbee858409cca2704f7d226f377e1996ed6858d9379567: Status 404 returned error can't find the container with id 3fdf753f6ea5040413cbee858409cca2704f7d226f377e1996ed6858d9379567 Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.315889 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bbzmc" Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.483291 4763 generic.go:334] "Generic (PLEG): container finished" podID="98083656-1422-49e1-a791-ed2ae6804df5" containerID="4c71b5c35b9c32b960fcfe9ebbdb726b0da9f171dbf5563dbcc7466188934b9c" exitCode=0 Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.483322 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztpwx" event={"ID":"98083656-1422-49e1-a791-ed2ae6804df5","Type":"ContainerDied","Data":"4c71b5c35b9c32b960fcfe9ebbdb726b0da9f171dbf5563dbcc7466188934b9c"} Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.483424 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztpwx" event={"ID":"98083656-1422-49e1-a791-ed2ae6804df5","Type":"ContainerStarted","Data":"3699ef6c154e4ef447ba4ea0867327b1548718f7a824a09c6cac64b0cdb0ac16"} Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.491715 4763 generic.go:334] "Generic (PLEG): container finished" podID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" containerID="37df57789e38ce3116ece041145a4fddd1558d11f1f5e17e0d016cfebf1ee6e0" exitCode=0 Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.529510 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dh7w" event={"ID":"3fe99667-cf30-4112-ae34-a5bdccbfc24f","Type":"ContainerDied","Data":"37df57789e38ce3116ece041145a4fddd1558d11f1f5e17e0d016cfebf1ee6e0"} Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.529648 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qd4l" event={"ID":"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a","Type":"ContainerStarted","Data":"3fdf753f6ea5040413cbee858409cca2704f7d226f377e1996ed6858d9379567"} Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.531848 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gdlbd" Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.757698 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bbzmc"] Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.927951 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.932403 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15335c39-01fa-41e8-a0c5-8c59033f818b-kube-api-access\") pod \"15335c39-01fa-41e8-a0c5-8c59033f818b\" (UID: \"15335c39-01fa-41e8-a0c5-8c59033f818b\") " Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.932464 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15335c39-01fa-41e8-a0c5-8c59033f818b-kubelet-dir\") pod \"15335c39-01fa-41e8-a0c5-8c59033f818b\" (UID: \"15335c39-01fa-41e8-a0c5-8c59033f818b\") " Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.932772 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15335c39-01fa-41e8-a0c5-8c59033f818b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "15335c39-01fa-41e8-a0c5-8c59033f818b" (UID: "15335c39-01fa-41e8-a0c5-8c59033f818b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:38:10 crc kubenswrapper[4763]: I0930 13:38:10.955214 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15335c39-01fa-41e8-a0c5-8c59033f818b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "15335c39-01fa-41e8-a0c5-8c59033f818b" (UID: "15335c39-01fa-41e8-a0c5-8c59033f818b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.034325 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15335c39-01fa-41e8-a0c5-8c59033f818b-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.034353 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15335c39-01fa-41e8-a0c5-8c59033f818b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.080771 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 13:38:11 crc kubenswrapper[4763]: E0930 13:38:11.081056 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15335c39-01fa-41e8-a0c5-8c59033f818b" containerName="pruner" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.081068 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="15335c39-01fa-41e8-a0c5-8c59033f818b" containerName="pruner" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.081162 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="15335c39-01fa-41e8-a0c5-8c59033f818b" containerName="pruner" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.083963 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.086803 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.086907 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.086962 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.140426 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/716c94be-c2b0-4523-87a5-505ab53bd786-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"716c94be-c2b0-4523-87a5-505ab53bd786\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.140637 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/716c94be-c2b0-4523-87a5-505ab53bd786-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"716c94be-c2b0-4523-87a5-505ab53bd786\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.242345 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/716c94be-c2b0-4523-87a5-505ab53bd786-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"716c94be-c2b0-4523-87a5-505ab53bd786\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.246386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/716c94be-c2b0-4523-87a5-505ab53bd786-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"716c94be-c2b0-4523-87a5-505ab53bd786\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.246969 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/716c94be-c2b0-4523-87a5-505ab53bd786-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"716c94be-c2b0-4523-87a5-505ab53bd786\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.261910 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/716c94be-c2b0-4523-87a5-505ab53bd786-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"716c94be-c2b0-4523-87a5-505ab53bd786\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.453863 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.537015 4763 generic.go:334] "Generic (PLEG): container finished" podID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerID="682775976d46fdbd58d5d7f21245bcffb3c88acb8466615e72422cd701a4fb97" exitCode=0 Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.537109 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qd4l" event={"ID":"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a","Type":"ContainerDied","Data":"682775976d46fdbd58d5d7f21245bcffb3c88acb8466615e72422cd701a4fb97"} Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.548439 4763 generic.go:334] "Generic (PLEG): container finished" podID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerID="427e9b6332ef5f5c3c900f35a80eae4c1640e4c183375641eb696e9d15323b62" exitCode=0 Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.548532 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbzmc" event={"ID":"69483f45-9432-41f3-81e3-eb02b8b1fb00","Type":"ContainerDied","Data":"427e9b6332ef5f5c3c900f35a80eae4c1640e4c183375641eb696e9d15323b62"} Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.548560 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbzmc" event={"ID":"69483f45-9432-41f3-81e3-eb02b8b1fb00","Type":"ContainerStarted","Data":"1de9d49a8787fd34f6e9fac040a5dbc3e6214c70daba79822bd0be4bc9602253"} Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.553371 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.555227 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"15335c39-01fa-41e8-a0c5-8c59033f818b","Type":"ContainerDied","Data":"a2f6af8565ebe07efd886c49002318e065bde33eaf24af22f950e80a5d416295"} Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.555264 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2f6af8565ebe07efd886c49002318e065bde33eaf24af22f950e80a5d416295" Sep 30 13:38:11 crc kubenswrapper[4763]: I0930 13:38:11.746771 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 13:38:11 crc kubenswrapper[4763]: W0930 13:38:11.764438 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod716c94be_c2b0_4523_87a5_505ab53bd786.slice/crio-14618e0c9ca6b82e119adbce5cd9f14d621fac76abc7e3a3016f6d8bfa967bd6 WatchSource:0}: Error finding container 14618e0c9ca6b82e119adbce5cd9f14d621fac76abc7e3a3016f6d8bfa967bd6: Status 404 returned error can't find the container with id 14618e0c9ca6b82e119adbce5cd9f14d621fac76abc7e3a3016f6d8bfa967bd6 Sep 30 13:38:12 crc kubenswrapper[4763]: I0930 13:38:12.572497 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"716c94be-c2b0-4523-87a5-505ab53bd786","Type":"ContainerStarted","Data":"a9dfbc1dd6a9b9122e16bc99d042705a1fe1940a1661c3eda579cd3d3bb5f115"} Sep 30 13:38:12 crc kubenswrapper[4763]: I0930 13:38:12.572900 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"716c94be-c2b0-4523-87a5-505ab53bd786","Type":"ContainerStarted","Data":"14618e0c9ca6b82e119adbce5cd9f14d621fac76abc7e3a3016f6d8bfa967bd6"} Sep 30 13:38:13 crc kubenswrapper[4763]: I0930 13:38:13.579874 4763 generic.go:334] "Generic (PLEG): container finished" podID="716c94be-c2b0-4523-87a5-505ab53bd786" containerID="a9dfbc1dd6a9b9122e16bc99d042705a1fe1940a1661c3eda579cd3d3bb5f115" exitCode=0 Sep 30 13:38:13 crc kubenswrapper[4763]: I0930 13:38:13.580246 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"716c94be-c2b0-4523-87a5-505ab53bd786","Type":"ContainerDied","Data":"a9dfbc1dd6a9b9122e16bc99d042705a1fe1940a1661c3eda579cd3d3bb5f115"} Sep 30 13:38:14 crc kubenswrapper[4763]: I0930 13:38:14.528629 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bdwgk" Sep 30 13:38:16 crc kubenswrapper[4763]: I0930 13:38:16.712542 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:38:16 crc kubenswrapper[4763]: I0930 13:38:16.724531 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/394a12b5-37c3-4933-af17-71f5c84ec2fa-metrics-certs\") pod \"network-metrics-daemon-rggrv\" (UID: \"394a12b5-37c3-4933-af17-71f5c84ec2fa\") " pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:38:16 crc kubenswrapper[4763]: I0930 13:38:16.831803 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rggrv" Sep 30 13:38:17 crc kubenswrapper[4763]: I0930 13:38:17.701220 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:38:17 crc kubenswrapper[4763]: I0930 13:38:17.706375 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:38:19 crc kubenswrapper[4763]: I0930 13:38:19.289086 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-f228d" Sep 30 13:38:20 crc kubenswrapper[4763]: I0930 13:38:20.000208 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:38:20 crc kubenswrapper[4763]: I0930 13:38:20.097069 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/716c94be-c2b0-4523-87a5-505ab53bd786-kubelet-dir\") pod \"716c94be-c2b0-4523-87a5-505ab53bd786\" (UID: \"716c94be-c2b0-4523-87a5-505ab53bd786\") " Sep 30 13:38:20 crc kubenswrapper[4763]: I0930 13:38:20.097152 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/716c94be-c2b0-4523-87a5-505ab53bd786-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "716c94be-c2b0-4523-87a5-505ab53bd786" (UID: "716c94be-c2b0-4523-87a5-505ab53bd786"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:38:20 crc kubenswrapper[4763]: I0930 13:38:20.097237 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/716c94be-c2b0-4523-87a5-505ab53bd786-kube-api-access\") pod \"716c94be-c2b0-4523-87a5-505ab53bd786\" (UID: \"716c94be-c2b0-4523-87a5-505ab53bd786\") " Sep 30 13:38:20 crc kubenswrapper[4763]: I0930 13:38:20.097698 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/716c94be-c2b0-4523-87a5-505ab53bd786-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 13:38:20 crc kubenswrapper[4763]: I0930 13:38:20.104357 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716c94be-c2b0-4523-87a5-505ab53bd786-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "716c94be-c2b0-4523-87a5-505ab53bd786" (UID: "716c94be-c2b0-4523-87a5-505ab53bd786"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:38:20 crc kubenswrapper[4763]: I0930 13:38:20.198995 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/716c94be-c2b0-4523-87a5-505ab53bd786-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:38:20 crc kubenswrapper[4763]: I0930 13:38:20.624127 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"716c94be-c2b0-4523-87a5-505ab53bd786","Type":"ContainerDied","Data":"14618e0c9ca6b82e119adbce5cd9f14d621fac76abc7e3a3016f6d8bfa967bd6"} Sep 30 13:38:20 crc kubenswrapper[4763]: I0930 13:38:20.624172 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14618e0c9ca6b82e119adbce5cd9f14d621fac76abc7e3a3016f6d8bfa967bd6" Sep 30 13:38:20 crc kubenswrapper[4763]: I0930 13:38:20.624191 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:38:26 crc kubenswrapper[4763]: I0930 13:38:26.168394 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:38:30 crc kubenswrapper[4763]: E0930 13:38:30.336290 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 13:38:30 crc kubenswrapper[4763]: E0930 13:38:30.337947 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-khsj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6svmn_openshift-marketplace(783a0279-c32d-4085-9274-6291b7544803): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:38:30 crc kubenswrapper[4763]: E0930 13:38:30.339180 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6svmn" podUID="783a0279-c32d-4085-9274-6291b7544803" Sep 30 13:38:31 crc kubenswrapper[4763]: E0930 13:38:31.480168 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6svmn" podUID="783a0279-c32d-4085-9274-6291b7544803" Sep 30 13:38:31 crc kubenswrapper[4763]: E0930 13:38:31.542029 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 13:38:31 crc kubenswrapper[4763]: E0930 13:38:31.542319 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-45rfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-th5tc_openshift-marketplace(7aa0d48c-e676-4d10-bc96-616a4730715d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:38:31 crc kubenswrapper[4763]: E0930 13:38:31.543765 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-th5tc" podUID="7aa0d48c-e676-4d10-bc96-616a4730715d" Sep 30 13:38:32 crc kubenswrapper[4763]: E0930 13:38:32.381581 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 30 13:38:32 crc kubenswrapper[4763]: E0930 13:38:32.381786 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pln2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-snds8_openshift-marketplace(8c12b612-76e8-4dc5-960e-8d6af4e5d1d6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:38:32 crc kubenswrapper[4763]: E0930 13:38:32.383017 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-snds8" podUID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" Sep 30 13:38:33 crc kubenswrapper[4763]: E0930 13:38:33.437309 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-th5tc" podUID="7aa0d48c-e676-4d10-bc96-616a4730715d" Sep 30 13:38:33 crc kubenswrapper[4763]: E0930 13:38:33.437382 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-snds8" podUID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" Sep 30 13:38:33 crc kubenswrapper[4763]: E0930 13:38:33.553125 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 13:38:33 crc kubenswrapper[4763]: E0930 13:38:33.553851 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-chk96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7dh7w_openshift-marketplace(3fe99667-cf30-4112-ae34-a5bdccbfc24f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:38:33 crc kubenswrapper[4763]: E0930 13:38:33.555073 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7dh7w" podUID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" Sep 30 13:38:33 crc kubenswrapper[4763]: E0930 13:38:33.598928 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 30 13:38:33 crc kubenswrapper[4763]: E0930 13:38:33.599102 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g4kvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-j5ns2_openshift-marketplace(b0ac82cc-de99-4b1b-a8f8-c36b5037cafb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:38:33 crc kubenswrapper[4763]: E0930 13:38:33.600458 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-j5ns2" podUID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" Sep 30 13:38:36 crc kubenswrapper[4763]: I0930 13:38:36.060484 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:38:36 crc kubenswrapper[4763]: I0930 13:38:36.061043 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:38:36 crc kubenswrapper[4763]: E0930 13:38:36.237045 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7dh7w" podUID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" Sep 30 13:38:36 crc kubenswrapper[4763]: E0930 13:38:36.237376 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-j5ns2" podUID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" Sep 30 13:38:36 crc kubenswrapper[4763]: I0930 
13:38:36.722646 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qd4l" event={"ID":"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a","Type":"ContainerStarted","Data":"c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b"} Sep 30 13:38:36 crc kubenswrapper[4763]: I0930 13:38:36.725614 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbzmc" event={"ID":"69483f45-9432-41f3-81e3-eb02b8b1fb00","Type":"ContainerStarted","Data":"c546156b5b05a65b9872a9b1d96a8cc4b7e3209893e976a3c5c0e31058767872"} Sep 30 13:38:36 crc kubenswrapper[4763]: I0930 13:38:36.726730 4763 generic.go:334] "Generic (PLEG): container finished" podID="98083656-1422-49e1-a791-ed2ae6804df5" containerID="54a1a72cd37817247a5dbcf18dd3cd1cdcd2232b193d99ccb0886e6549663e22" exitCode=0 Sep 30 13:38:36 crc kubenswrapper[4763]: I0930 13:38:36.726760 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztpwx" event={"ID":"98083656-1422-49e1-a791-ed2ae6804df5","Type":"ContainerDied","Data":"54a1a72cd37817247a5dbcf18dd3cd1cdcd2232b193d99ccb0886e6549663e22"} Sep 30 13:38:36 crc kubenswrapper[4763]: I0930 13:38:36.828176 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rggrv"] Sep 30 13:38:37 crc kubenswrapper[4763]: I0930 13:38:37.739671 4763 generic.go:334] "Generic (PLEG): container finished" podID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerID="c546156b5b05a65b9872a9b1d96a8cc4b7e3209893e976a3c5c0e31058767872" exitCode=0 Sep 30 13:38:37 crc kubenswrapper[4763]: I0930 13:38:37.739778 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbzmc" event={"ID":"69483f45-9432-41f3-81e3-eb02b8b1fb00","Type":"ContainerDied","Data":"c546156b5b05a65b9872a9b1d96a8cc4b7e3209893e976a3c5c0e31058767872"} Sep 30 13:38:37 crc kubenswrapper[4763]: I0930 13:38:37.743534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztpwx" event={"ID":"98083656-1422-49e1-a791-ed2ae6804df5","Type":"ContainerStarted","Data":"08e3dafe1dbe3c4951c0f167c433b19ceaa93bf4975909e15fe0a8055f558657"} Sep 30 13:38:37 crc kubenswrapper[4763]: I0930 13:38:37.747943 4763 generic.go:334] "Generic (PLEG): container finished" podID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerID="c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b" exitCode=0 Sep 30 13:38:37 crc kubenswrapper[4763]: I0930 13:38:37.748430 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qd4l" event={"ID":"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a","Type":"ContainerDied","Data":"c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b"} Sep 30 13:38:37 crc kubenswrapper[4763]: I0930 13:38:37.751888 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rggrv" event={"ID":"394a12b5-37c3-4933-af17-71f5c84ec2fa","Type":"ContainerStarted","Data":"91b0c1223c2834daade478be8df34f92d7311416e3c6cf1503253c1935dd0da0"} Sep 30 13:38:37 crc kubenswrapper[4763]: I0930 13:38:37.751965 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rggrv" event={"ID":"394a12b5-37c3-4933-af17-71f5c84ec2fa","Type":"ContainerStarted","Data":"c9b52f5dd5beb5d05d114f3e08d14c179a890b4e5c64326b578d5afe7fdba301"} Sep 30 13:38:37 crc kubenswrapper[4763]: I0930 13:38:37.751985 4763 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rggrv" event={"ID":"394a12b5-37c3-4933-af17-71f5c84ec2fa","Type":"ContainerStarted","Data":"c35a23b2cc29808667d919235bdf5706619f0a30af6f28f53f7e109f0175d8ba"} Sep 30 13:38:37 crc kubenswrapper[4763]: I0930 13:38:37.789719 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rggrv" podStartSLOduration=163.789623115 podStartE2EDuration="2m43.789623115s" podCreationTimestamp="2025-09-30 13:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:38:37.789100502 +0000 UTC m=+189.927660787" watchObservedRunningTime="2025-09-30 13:38:37.789623115 +0000 UTC m=+189.928183400" Sep 30 13:38:37 crc kubenswrapper[4763]: I0930 13:38:37.837525 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ztpwx" podStartSLOduration=4.265848961 podStartE2EDuration="29.837494646s" podCreationTimestamp="2025-09-30 13:38:08 +0000 UTC" firstStartedPulling="2025-09-30 13:38:11.559639645 +0000 UTC m=+163.698199930" lastFinishedPulling="2025-09-30 13:38:37.13128533 +0000 UTC m=+189.269845615" observedRunningTime="2025-09-30 13:38:37.829428018 +0000 UTC m=+189.967988304" watchObservedRunningTime="2025-09-30 13:38:37.837494646 +0000 UTC m=+189.976054941" Sep 30 13:38:38 crc kubenswrapper[4763]: I0930 13:38:38.758228 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qd4l" event={"ID":"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a","Type":"ContainerStarted","Data":"7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424"} Sep 30 13:38:38 crc kubenswrapper[4763]: I0930 13:38:38.762287 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbzmc" event={"ID":"69483f45-9432-41f3-81e3-eb02b8b1fb00","Type":"ContainerStarted","Data":"d96c44bf45c97b1ca2d2d13dc5d4b2bfe5415cbf6081f25bbfa2338034f5fa8a"} Sep 30 13:38:38 crc kubenswrapper[4763]: I0930 13:38:38.784335 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5qd4l" podStartSLOduration=3.034417698 podStartE2EDuration="29.784319579s" podCreationTimestamp="2025-09-30 13:38:09 +0000 UTC" firstStartedPulling="2025-09-30 13:38:11.538847832 +0000 UTC m=+163.677408117" lastFinishedPulling="2025-09-30 13:38:38.288749713 +0000 UTC m=+190.427309998" observedRunningTime="2025-09-30 13:38:38.780952711 +0000 UTC m=+190.919513036" watchObservedRunningTime="2025-09-30 13:38:38.784319579 +0000 UTC m=+190.922879854" Sep 30 13:38:38 crc kubenswrapper[4763]: I0930 13:38:38.806561 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bbzmc" podStartSLOduration=3.061645141 podStartE2EDuration="29.806532345s" podCreationTimestamp="2025-09-30 13:38:09 +0000 UTC" firstStartedPulling="2025-09-30 13:38:11.557957746 +0000 UTC m=+163.696518031" lastFinishedPulling="2025-09-30 13:38:38.30284495 +0000 UTC m=+190.441405235" observedRunningTime="2025-09-30 13:38:38.799139804 +0000 UTC m=+190.937700089" watchObservedRunningTime="2025-09-30 13:38:38.806532345 +0000 UTC m=+190.945092670" Sep 30 13:38:39 crc kubenswrapper[4763]: I0930 13:38:39.031738 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ztpwx" Sep 30 13:38:39 crc 
Sep 30 13:38:39 crc kubenswrapper[4763]: I0930 13:38:39.031803 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:39 crc kubenswrapper[4763]: I0930 13:38:39.156132 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:39 crc kubenswrapper[4763]: I0930 13:38:39.619638 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:39 crc kubenswrapper[4763]: I0930 13:38:39.620050 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:40 crc kubenswrapper[4763]: I0930 13:38:40.000798 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ddx8p"
Sep 30 13:38:40 crc kubenswrapper[4763]: I0930 13:38:40.316858 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:40 crc kubenswrapper[4763]: I0930 13:38:40.316936 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:40 crc kubenswrapper[4763]: I0930 13:38:40.659009 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5qd4l" podUID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerName="registry-server" probeResult="failure" output=<
Sep 30 13:38:40 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s
Sep 30 13:38:40 crc kubenswrapper[4763]: >
Sep 30 13:38:41 crc kubenswrapper[4763]: I0930 13:38:41.370548 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bbzmc" podUID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerName="registry-server" probeResult="failure" output=<
Sep 30 13:38:41 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s
Sep 30 13:38:41 crc kubenswrapper[4763]: >
Sep 30 13:38:43 crc kubenswrapper[4763]: I0930 13:38:43.791129 4763 generic.go:334] "Generic (PLEG): container finished" podID="783a0279-c32d-4085-9274-6291b7544803" containerID="3afda04886cd1b2ef9dc8b6940c6df0b3d2d803e5449b48b3aa0f3667b03837d" exitCode=0
Sep 30 13:38:43 crc kubenswrapper[4763]: I0930 13:38:43.791208 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6svmn" event={"ID":"783a0279-c32d-4085-9274-6291b7544803","Type":"ContainerDied","Data":"3afda04886cd1b2ef9dc8b6940c6df0b3d2d803e5449b48b3aa0f3667b03837d"}
Sep 30 13:38:44 crc kubenswrapper[4763]: I0930 13:38:44.799879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6svmn" event={"ID":"783a0279-c32d-4085-9274-6291b7544803","Type":"ContainerStarted","Data":"8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4"}
Sep 30 13:38:44 crc kubenswrapper[4763]: I0930 13:38:44.825533 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6svmn" podStartSLOduration=1.867709344 podStartE2EDuration="38.825508589s" podCreationTimestamp="2025-09-30 13:38:06 +0000 UTC" firstStartedPulling="2025-09-30 13:38:07.373367099 +0000 UTC m=+159.511927384" lastFinishedPulling="2025-09-30 13:38:44.331166334 +0000 UTC m=+196.469726629" observedRunningTime="2025-09-30 13:38:44.822820442 +0000 UTC m=+196.961380767" watchObservedRunningTime="2025-09-30 13:38:44.825508589 +0000 UTC m=+196.964068884"
Sep 30 13:38:46 crc kubenswrapper[4763]: I0930 13:38:46.403646 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:46 crc kubenswrapper[4763]: I0930 13:38:46.404042 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:47 crc kubenswrapper[4763]: I0930 13:38:47.451549 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6svmn" podUID="783a0279-c32d-4085-9274-6291b7544803" containerName="registry-server" probeResult="failure" output=<
Sep 30 13:38:47 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s
Sep 30 13:38:47 crc kubenswrapper[4763]: >
Sep 30 13:38:49 crc kubenswrapper[4763]: I0930 13:38:49.110773 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:49 crc kubenswrapper[4763]: I0930 13:38:49.159465 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztpwx"]
Sep 30 13:38:49 crc kubenswrapper[4763]: I0930 13:38:49.689393 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:49 crc kubenswrapper[4763]: I0930 13:38:49.762546 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5qd4l"
Sep 30 13:38:49 crc kubenswrapper[4763]: I0930 13:38:49.833882 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ztpwx" podUID="98083656-1422-49e1-a791-ed2ae6804df5" containerName="registry-server" containerID="cri-o://08e3dafe1dbe3c4951c0f167c433b19ceaa93bf4975909e15fe0a8055f558657" gracePeriod=2
Sep 30 13:38:50 crc kubenswrapper[4763]: I0930 13:38:50.386048 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:50 crc kubenswrapper[4763]: I0930 13:38:50.442225 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:50 crc kubenswrapper[4763]: I0930 13:38:50.846102 4763 generic.go:334] "Generic (PLEG): container finished" podID="98083656-1422-49e1-a791-ed2ae6804df5" containerID="08e3dafe1dbe3c4951c0f167c433b19ceaa93bf4975909e15fe0a8055f558657" exitCode=0
Sep 30 13:38:50 crc kubenswrapper[4763]: I0930 13:38:50.846326 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztpwx" event={"ID":"98083656-1422-49e1-a791-ed2ae6804df5","Type":"ContainerDied","Data":"08e3dafe1dbe3c4951c0f167c433b19ceaa93bf4975909e15fe0a8055f558657"}
Sep 30 13:38:50 crc kubenswrapper[4763]: I0930 13:38:50.849843 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th5tc" event={"ID":"7aa0d48c-e676-4d10-bc96-616a4730715d","Type":"ContainerStarted","Data":"6b9f9cdcb7f59aa3c3b3b5c95294d7be1f4f9e842d77d774ce6e4868ce55085e"}
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.439013 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.587465 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-catalog-content\") pod \"98083656-1422-49e1-a791-ed2ae6804df5\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") "
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.587761 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-utilities\") pod \"98083656-1422-49e1-a791-ed2ae6804df5\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") "
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.587833 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tpfq\" (UniqueName: \"kubernetes.io/projected/98083656-1422-49e1-a791-ed2ae6804df5-kube-api-access-5tpfq\") pod \"98083656-1422-49e1-a791-ed2ae6804df5\" (UID: \"98083656-1422-49e1-a791-ed2ae6804df5\") "
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.590027 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-utilities" (OuterVolumeSpecName: "utilities") pod "98083656-1422-49e1-a791-ed2ae6804df5" (UID: "98083656-1422-49e1-a791-ed2ae6804df5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.597457 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98083656-1422-49e1-a791-ed2ae6804df5-kube-api-access-5tpfq" (OuterVolumeSpecName: "kube-api-access-5tpfq") pod "98083656-1422-49e1-a791-ed2ae6804df5" (UID: "98083656-1422-49e1-a791-ed2ae6804df5"). InnerVolumeSpecName "kube-api-access-5tpfq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.606818 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98083656-1422-49e1-a791-ed2ae6804df5" (UID: "98083656-1422-49e1-a791-ed2ae6804df5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.690269 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.690309 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98083656-1422-49e1-a791-ed2ae6804df5-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.690327 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tpfq\" (UniqueName: \"kubernetes.io/projected/98083656-1422-49e1-a791-ed2ae6804df5-kube-api-access-5tpfq\") on node \"crc\" DevicePath \"\""
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.858430 4763 generic.go:334] "Generic (PLEG): container finished" podID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" containerID="703197551a7d032cd81109d666b805b2b11602b48886ab11faf351f985974057" exitCode=0
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.858468 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snds8" event={"ID":"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6","Type":"ContainerDied","Data":"703197551a7d032cd81109d666b805b2b11602b48886ab11faf351f985974057"}
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.861304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztpwx" event={"ID":"98083656-1422-49e1-a791-ed2ae6804df5","Type":"ContainerDied","Data":"3699ef6c154e4ef447ba4ea0867327b1548718f7a824a09c6cac64b0cdb0ac16"}
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.861353 4763 scope.go:117] "RemoveContainer" containerID="08e3dafe1dbe3c4951c0f167c433b19ceaa93bf4975909e15fe0a8055f558657"
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.861474 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztpwx"
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.865560 4763 generic.go:334] "Generic (PLEG): container finished" podID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" containerID="b32c51f3fb059dec6462c3288c928cd52679d4d00bb400421111c6f2570cd467" exitCode=0
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.865655 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j5ns2" event={"ID":"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb","Type":"ContainerDied","Data":"b32c51f3fb059dec6462c3288c928cd52679d4d00bb400421111c6f2570cd467"}
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.868672 4763 generic.go:334] "Generic (PLEG): container finished" podID="7aa0d48c-e676-4d10-bc96-616a4730715d" containerID="6b9f9cdcb7f59aa3c3b3b5c95294d7be1f4f9e842d77d774ce6e4868ce55085e" exitCode=0
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.868700 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th5tc" event={"ID":"7aa0d48c-e676-4d10-bc96-616a4730715d","Type":"ContainerDied","Data":"6b9f9cdcb7f59aa3c3b3b5c95294d7be1f4f9e842d77d774ce6e4868ce55085e"}
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.938650 4763 scope.go:117] "RemoveContainer" containerID="54a1a72cd37817247a5dbcf18dd3cd1cdcd2232b193d99ccb0886e6549663e22"
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.941490 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztpwx"]
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.944972 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztpwx"]
Sep 30 13:38:51 crc kubenswrapper[4763]: I0930 13:38:51.959566 4763 scope.go:117] "RemoveContainer" containerID="4c71b5c35b9c32b960fcfe9ebbdb726b0da9f171dbf5563dbcc7466188934b9c"
Sep 30 13:38:52 crc kubenswrapper[4763]: I0930 13:38:52.503413 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98083656-1422-49e1-a791-ed2ae6804df5" path="/var/lib/kubelet/pods/98083656-1422-49e1-a791-ed2ae6804df5/volumes"
Sep 30 13:38:52 crc kubenswrapper[4763]: I0930 13:38:52.732759 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bbzmc"]
Sep 30 13:38:52 crc kubenswrapper[4763]: I0930 13:38:52.733029 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bbzmc" podUID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerName="registry-server" containerID="cri-o://d96c44bf45c97b1ca2d2d13dc5d4b2bfe5415cbf6081f25bbfa2338034f5fa8a" gracePeriod=2
Sep 30 13:38:52 crc kubenswrapper[4763]: I0930 13:38:52.875846 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th5tc" event={"ID":"7aa0d48c-e676-4d10-bc96-616a4730715d","Type":"ContainerStarted","Data":"20070229a288e63a50c15b23bdfe0eaa968b1ec60cb91296a82fa0de1fcd386c"}
Sep 30 13:38:52 crc kubenswrapper[4763]: I0930 13:38:52.877806 4763 generic.go:334] "Generic (PLEG): container finished" podID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerID="d96c44bf45c97b1ca2d2d13dc5d4b2bfe5415cbf6081f25bbfa2338034f5fa8a" exitCode=0
Sep 30 13:38:52 crc kubenswrapper[4763]: I0930 13:38:52.877860 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbzmc" event={"ID":"69483f45-9432-41f3-81e3-eb02b8b1fb00","Type":"ContainerDied","Data":"d96c44bf45c97b1ca2d2d13dc5d4b2bfe5415cbf6081f25bbfa2338034f5fa8a"}
Sep 30 13:38:52 crc kubenswrapper[4763]: I0930 13:38:52.880356 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snds8" event={"ID":"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6","Type":"ContainerStarted","Data":"c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd"}
Sep 30 13:38:52 crc kubenswrapper[4763]: I0930 13:38:52.884961 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j5ns2" event={"ID":"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb","Type":"ContainerStarted","Data":"b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802"}
Sep 30 13:38:52 crc kubenswrapper[4763]: I0930 13:38:52.899328 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-th5tc" podStartSLOduration=1.899170567 podStartE2EDuration="46.899304534s" podCreationTimestamp="2025-09-30 13:38:06 +0000 UTC" firstStartedPulling="2025-09-30 13:38:07.371011694 +0000 UTC m=+159.509571979" lastFinishedPulling="2025-09-30 13:38:52.371145651 +0000 UTC m=+204.509705946" observedRunningTime="2025-09-30 13:38:52.897380716 +0000 UTC m=+205.035941001" watchObservedRunningTime="2025-09-30 13:38:52.899304534 +0000 UTC m=+205.037864819"
Sep 30 13:38:52 crc kubenswrapper[4763]: I0930 13:38:52.921235 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j5ns2" podStartSLOduration=2.0423359 podStartE2EDuration="46.921213483s" podCreationTimestamp="2025-09-30 13:38:06 +0000 UTC" firstStartedPulling="2025-09-30 13:38:07.378685622 +0000 UTC m=+159.517245907" lastFinishedPulling="2025-09-30 13:38:52.257563205 +0000 UTC m=+204.396123490" observedRunningTime="2025-09-30 13:38:52.918809692 +0000 UTC m=+205.057369987" watchObservedRunningTime="2025-09-30 13:38:52.921213483 +0000 UTC m=+205.059773768"
Sep 30 13:38:52 crc kubenswrapper[4763]: I0930 13:38:52.948448 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-snds8" podStartSLOduration=3.017921207 podStartE2EDuration="46.948428014s" podCreationTimestamp="2025-09-30 13:38:06 +0000 UTC" firstStartedPulling="2025-09-30 13:38:08.388725243 +0000 UTC m=+160.527285528" lastFinishedPulling="2025-09-30 13:38:52.31923202 +0000 UTC m=+204.457792335" observedRunningTime="2025-09-30 13:38:52.946140258 +0000 UTC m=+205.084700553" watchObservedRunningTime="2025-09-30 13:38:52.948428014 +0000 UTC m=+205.086988299"
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.095810 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.212657 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cv4d\" (UniqueName: \"kubernetes.io/projected/69483f45-9432-41f3-81e3-eb02b8b1fb00-kube-api-access-6cv4d\") pod \"69483f45-9432-41f3-81e3-eb02b8b1fb00\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") "
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.212756 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-catalog-content\") pod \"69483f45-9432-41f3-81e3-eb02b8b1fb00\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") "
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.212834 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-utilities\") pod \"69483f45-9432-41f3-81e3-eb02b8b1fb00\" (UID: \"69483f45-9432-41f3-81e3-eb02b8b1fb00\") "
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.213960 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-utilities" (OuterVolumeSpecName: "utilities") pod "69483f45-9432-41f3-81e3-eb02b8b1fb00" (UID: "69483f45-9432-41f3-81e3-eb02b8b1fb00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.223811 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69483f45-9432-41f3-81e3-eb02b8b1fb00-kube-api-access-6cv4d" (OuterVolumeSpecName: "kube-api-access-6cv4d") pod "69483f45-9432-41f3-81e3-eb02b8b1fb00" (UID: "69483f45-9432-41f3-81e3-eb02b8b1fb00"). InnerVolumeSpecName "kube-api-access-6cv4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.299479 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69483f45-9432-41f3-81e3-eb02b8b1fb00" (UID: "69483f45-9432-41f3-81e3-eb02b8b1fb00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.314651 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.314687 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69483f45-9432-41f3-81e3-eb02b8b1fb00-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.314696 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cv4d\" (UniqueName: \"kubernetes.io/projected/69483f45-9432-41f3-81e3-eb02b8b1fb00-kube-api-access-6cv4d\") on node \"crc\" DevicePath \"\""
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.900970 4763 generic.go:334] "Generic (PLEG): container finished" podID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" containerID="1acf90996e39efe5abf11e96438794c1e23f5fcb16cc11bc07ed4dd2daefe97a" exitCode=0
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.901054 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dh7w" event={"ID":"3fe99667-cf30-4112-ae34-a5bdccbfc24f","Type":"ContainerDied","Data":"1acf90996e39efe5abf11e96438794c1e23f5fcb16cc11bc07ed4dd2daefe97a"}
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.904403 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbzmc" event={"ID":"69483f45-9432-41f3-81e3-eb02b8b1fb00","Type":"ContainerDied","Data":"1de9d49a8787fd34f6e9fac040a5dbc3e6214c70daba79822bd0be4bc9602253"}
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.904456 4763 scope.go:117] "RemoveContainer" containerID="d96c44bf45c97b1ca2d2d13dc5d4b2bfe5415cbf6081f25bbfa2338034f5fa8a"
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.904481 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bbzmc"
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.931403 4763 scope.go:117] "RemoveContainer" containerID="c546156b5b05a65b9872a9b1d96a8cc4b7e3209893e976a3c5c0e31058767872"
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.945295 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bbzmc"]
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.950225 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bbzmc"]
Sep 30 13:38:53 crc kubenswrapper[4763]: I0930 13:38:53.959691 4763 scope.go:117] "RemoveContainer" containerID="427e9b6332ef5f5c3c900f35a80eae4c1640e4c183375641eb696e9d15323b62"
Sep 30 13:38:54 crc kubenswrapper[4763]: I0930 13:38:54.496714 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69483f45-9432-41f3-81e3-eb02b8b1fb00" path="/var/lib/kubelet/pods/69483f45-9432-41f3-81e3-eb02b8b1fb00/volumes"
Sep 30 13:38:54 crc kubenswrapper[4763]: I0930 13:38:54.912868 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dh7w" event={"ID":"3fe99667-cf30-4112-ae34-a5bdccbfc24f","Type":"ContainerStarted","Data":"ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571"}
Sep 30 13:38:54 crc kubenswrapper[4763]: I0930 13:38:54.932757 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7dh7w" podStartSLOduration=3.173174724 podStartE2EDuration="46.932740471s" podCreationTimestamp="2025-09-30 13:38:08 +0000 UTC" firstStartedPulling="2025-09-30 13:38:10.50249839 +0000 UTC m=+162.641058675" lastFinishedPulling="2025-09-30 13:38:54.262064137 +0000 UTC m=+206.400624422" observedRunningTime="2025-09-30 13:38:54.931964881 +0000 UTC m=+207.070525166" watchObservedRunningTime="2025-09-30 13:38:54.932740471 +0000 UTC m=+207.071300756"
Sep 30 13:38:56 crc kubenswrapper[4763]: I0930 13:38:56.453391 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:56 crc kubenswrapper[4763]: I0930 13:38:56.504163 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:38:56 crc kubenswrapper[4763]: I0930 13:38:56.616743 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:56 crc kubenswrapper[4763]: I0930 13:38:56.616806 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:56 crc kubenswrapper[4763]: I0930 13:38:56.656583 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:38:56 crc kubenswrapper[4763]: I0930 13:38:56.819222 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:56 crc kubenswrapper[4763]: I0930 13:38:56.819275 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:56 crc kubenswrapper[4763]: I0930 13:38:56.867097 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:38:57 crc kubenswrapper[4763]: I0930 13:38:57.011299 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:57 crc kubenswrapper[4763]: I0930 13:38:57.012048 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:57 crc kubenswrapper[4763]: I0930 13:38:57.056312 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:57 crc kubenswrapper[4763]: I0930 13:38:57.969971 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:38:58 crc kubenswrapper[4763]: I0930 13:38:58.626078 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:58 crc kubenswrapper[4763]: I0930 13:38:58.626113 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:58 crc kubenswrapper[4763]: I0930 13:38:58.679363 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:38:59 crc kubenswrapper[4763]: I0930 13:38:59.730976 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-snds8"]
Sep 30 13:39:00 crc kubenswrapper[4763]: I0930 13:39:00.952892 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-snds8" podUID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" containerName="registry-server" containerID="cri-o://c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd" gracePeriod=2
Sep 30 13:39:01 crc kubenswrapper[4763]: I0930 13:39:01.940409 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:39:01 crc kubenswrapper[4763]: I0930 13:39:01.965432 4763 generic.go:334] "Generic (PLEG): container finished" podID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" containerID="c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd" exitCode=0
Sep 30 13:39:01 crc kubenswrapper[4763]: I0930 13:39:01.965475 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snds8" event={"ID":"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6","Type":"ContainerDied","Data":"c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd"}
Sep 30 13:39:01 crc kubenswrapper[4763]: I0930 13:39:01.965501 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snds8" event={"ID":"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6","Type":"ContainerDied","Data":"0e73aa36ad0fb6019fb555e88f1d9bd2240cd0621fe44eb5e0fa88f308d11199"}
Sep 30 13:39:01 crc kubenswrapper[4763]: I0930 13:39:01.965516 4763 scope.go:117] "RemoveContainer" containerID="c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd"
Sep 30 13:39:01 crc kubenswrapper[4763]: I0930 13:39:01.965617 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snds8"
Sep 30 13:39:01 crc kubenswrapper[4763]: I0930 13:39:01.981639 4763 scope.go:117] "RemoveContainer" containerID="703197551a7d032cd81109d666b805b2b11602b48886ab11faf351f985974057"
Sep 30 13:39:01 crc kubenswrapper[4763]: I0930 13:39:01.994490 4763 scope.go:117] "RemoveContainer" containerID="1defee6b6c230b2f18c79cdbe3222ccef3d0c82add9a8322195f0f7c2f9c743b"
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.010196 4763 scope.go:117] "RemoveContainer" containerID="c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd"
Sep 30 13:39:02 crc kubenswrapper[4763]: E0930 13:39:02.010669 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd\": container with ID starting with c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd not found: ID does not exist" containerID="c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd"
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.010713 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd"} err="failed to get container status \"c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd\": rpc error: code = NotFound desc = could not find container \"c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd\": container with ID starting with c36cc855e8cd5f9f91e1c8b7e2fbab626bc84385b0e754747746f2225850d7cd not found: ID does not exist"
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.010763 4763 scope.go:117] "RemoveContainer" containerID="703197551a7d032cd81109d666b805b2b11602b48886ab11faf351f985974057"
Sep 30 13:39:02 crc kubenswrapper[4763]: E0930 13:39:02.011062 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703197551a7d032cd81109d666b805b2b11602b48886ab11faf351f985974057\": container with ID starting with 703197551a7d032cd81109d666b805b2b11602b48886ab11faf351f985974057 not found: ID does not exist" containerID="703197551a7d032cd81109d666b805b2b11602b48886ab11faf351f985974057"
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.011106 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703197551a7d032cd81109d666b805b2b11602b48886ab11faf351f985974057"} err="failed to get container status \"703197551a7d032cd81109d666b805b2b11602b48886ab11faf351f985974057\": rpc error: code = NotFound desc = could not find container \"703197551a7d032cd81109d666b805b2b11602b48886ab11faf351f985974057\": container with ID starting with 703197551a7d032cd81109d666b805b2b11602b48886ab11faf351f985974057 not found: ID does not exist"
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.011139 4763 scope.go:117] "RemoveContainer" containerID="1defee6b6c230b2f18c79cdbe3222ccef3d0c82add9a8322195f0f7c2f9c743b"
Sep 30 13:39:02 crc kubenswrapper[4763]: E0930 13:39:02.011384 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1defee6b6c230b2f18c79cdbe3222ccef3d0c82add9a8322195f0f7c2f9c743b\": container with ID starting with 1defee6b6c230b2f18c79cdbe3222ccef3d0c82add9a8322195f0f7c2f9c743b not found: ID does not exist" containerID="1defee6b6c230b2f18c79cdbe3222ccef3d0c82add9a8322195f0f7c2f9c743b"
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.011415 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1defee6b6c230b2f18c79cdbe3222ccef3d0c82add9a8322195f0f7c2f9c743b"} err="failed to get container status \"1defee6b6c230b2f18c79cdbe3222ccef3d0c82add9a8322195f0f7c2f9c743b\": rpc error: code = NotFound desc = could not find container \"1defee6b6c230b2f18c79cdbe3222ccef3d0c82add9a8322195f0f7c2f9c743b\": container with ID starting with 1defee6b6c230b2f18c79cdbe3222ccef3d0c82add9a8322195f0f7c2f9c743b not found: ID does not exist"
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.060435 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-catalog-content\") pod \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") "
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.060476 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pln2n\" (UniqueName: \"kubernetes.io/projected/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-kube-api-access-pln2n\") pod \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") "
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.060549 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-utilities\") pod \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\" (UID: \"8c12b612-76e8-4dc5-960e-8d6af4e5d1d6\") "
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.061562 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-utilities" (OuterVolumeSpecName: "utilities") pod "8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" (UID: "8c12b612-76e8-4dc5-960e-8d6af4e5d1d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.069704 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-kube-api-access-pln2n" (OuterVolumeSpecName: "kube-api-access-pln2n") pod "8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" (UID: "8c12b612-76e8-4dc5-960e-8d6af4e5d1d6"). InnerVolumeSpecName "kube-api-access-pln2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.103231 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" (UID: "8c12b612-76e8-4dc5-960e-8d6af4e5d1d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.161983 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.162023 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.162041 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pln2n\" (UniqueName: \"kubernetes.io/projected/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6-kube-api-access-pln2n\") on node \"crc\" DevicePath \"\""
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.293877 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-snds8"]
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.296128 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-snds8"]
Sep 30 13:39:02 crc kubenswrapper[4763]: I0930 13:39:02.496346 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" path="/var/lib/kubelet/pods/8c12b612-76e8-4dc5-960e-8d6af4e5d1d6/volumes"
Sep 30 13:39:06 crc kubenswrapper[4763]: I0930 13:39:06.060033 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 13:39:06 crc kubenswrapper[4763]: I0930 13:39:06.060341 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 13:39:06 crc kubenswrapper[4763]: I0930 13:39:06.060394 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns"
Sep 30 13:39:06 crc kubenswrapper[4763]: I0930 13:39:06.060985 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 13:39:06 crc kubenswrapper[4763]: I0930 13:39:06.061052 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa" gracePeriod=600
Sep 30 13:39:06 crc kubenswrapper[4763]: I0930 13:39:06.655703 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j5ns2"
Sep 30 13:39:06 crc kubenswrapper[4763]: I0930 13:39:06.863102 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:39:06 crc kubenswrapper[4763]: I0930 13:39:06.999070 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa" exitCode=0
Sep 30 13:39:06 crc kubenswrapper[4763]: I0930 13:39:06.999135 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa"}
Sep 30 13:39:06 crc kubenswrapper[4763]: I0930 13:39:06.999205 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"1e86f4169f74235b6e40ac7fe666fe2e530464ddaaf4bcda5a2f4e63d77e25c9"}
Sep 30 13:39:08 crc kubenswrapper[4763]: I0930 13:39:08.059444 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vjdsr"]
Sep 30 13:39:08 crc kubenswrapper[4763]: I0930 13:39:08.712097 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-th5tc"]
Sep 30 13:39:08 crc kubenswrapper[4763]: I0930 13:39:08.712352 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-th5tc" podUID="7aa0d48c-e676-4d10-bc96-616a4730715d" containerName="registry-server" containerID="cri-o://20070229a288e63a50c15b23bdfe0eaa968b1ec60cb91296a82fa0de1fcd386c" gracePeriod=2
Sep 30 13:39:08 crc kubenswrapper[4763]: I0930 13:39:08.718396 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.025372 4763 generic.go:334] "Generic (PLEG): container finished" podID="7aa0d48c-e676-4d10-bc96-616a4730715d" containerID="20070229a288e63a50c15b23bdfe0eaa968b1ec60cb91296a82fa0de1fcd386c" exitCode=0
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.025748 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th5tc" event={"ID":"7aa0d48c-e676-4d10-bc96-616a4730715d","Type":"ContainerDied","Data":"20070229a288e63a50c15b23bdfe0eaa968b1ec60cb91296a82fa0de1fcd386c"}
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.099914 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.197270 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-catalog-content\") pod \"7aa0d48c-e676-4d10-bc96-616a4730715d\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") "
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.197379 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45rfn\" (UniqueName: \"kubernetes.io/projected/7aa0d48c-e676-4d10-bc96-616a4730715d-kube-api-access-45rfn\") pod \"7aa0d48c-e676-4d10-bc96-616a4730715d\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") "
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.197452 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-utilities\") pod \"7aa0d48c-e676-4d10-bc96-616a4730715d\" (UID: \"7aa0d48c-e676-4d10-bc96-616a4730715d\") "
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.198308 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-utilities" (OuterVolumeSpecName: "utilities") pod "7aa0d48c-e676-4d10-bc96-616a4730715d" (UID: "7aa0d48c-e676-4d10-bc96-616a4730715d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.203059 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa0d48c-e676-4d10-bc96-616a4730715d-kube-api-access-45rfn" (OuterVolumeSpecName: "kube-api-access-45rfn") pod "7aa0d48c-e676-4d10-bc96-616a4730715d" (UID: "7aa0d48c-e676-4d10-bc96-616a4730715d"). InnerVolumeSpecName "kube-api-access-45rfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.243008 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7aa0d48c-e676-4d10-bc96-616a4730715d" (UID: "7aa0d48c-e676-4d10-bc96-616a4730715d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.299236 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.299273 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45rfn\" (UniqueName: \"kubernetes.io/projected/7aa0d48c-e676-4d10-bc96-616a4730715d-kube-api-access-45rfn\") on node \"crc\" DevicePath \"\""
Sep 30 13:39:09 crc kubenswrapper[4763]: I0930 13:39:09.299283 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa0d48c-e676-4d10-bc96-616a4730715d-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 13:39:10 crc kubenswrapper[4763]: I0930 13:39:10.033887 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th5tc" event={"ID":"7aa0d48c-e676-4d10-bc96-616a4730715d","Type":"ContainerDied","Data":"4453ced9808998240425dba0a30251806cc3b6e9bb77bae4f7a0bbdcafc2ca25"}
Sep 30 13:39:10 crc kubenswrapper[4763]: I0930 13:39:10.033936 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-th5tc"
Sep 30 13:39:10 crc kubenswrapper[4763]: I0930 13:39:10.034198 4763 scope.go:117] "RemoveContainer" containerID="20070229a288e63a50c15b23bdfe0eaa968b1ec60cb91296a82fa0de1fcd386c"
Sep 30 13:39:10 crc kubenswrapper[4763]: I0930 13:39:10.051397 4763 scope.go:117] "RemoveContainer" containerID="6b9f9cdcb7f59aa3c3b3b5c95294d7be1f4f9e842d77d774ce6e4868ce55085e"
Sep 30 13:39:10 crc kubenswrapper[4763]: I0930 13:39:10.059122 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-th5tc"]
Sep 30 13:39:10 crc kubenswrapper[4763]: I0930 13:39:10.063252 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-th5tc"]
Sep 30 13:39:10 crc kubenswrapper[4763]: I0930 13:39:10.086682 4763 scope.go:117] "RemoveContainer" containerID="6d0a1494b4a8dc25281fa509f6c87e06193a5e2f06544f75e64c3c174fae6c8b"
Sep 30 13:39:10 crc kubenswrapper[4763]: I0930 13:39:10.497016 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa0d48c-e676-4d10-bc96-616a4730715d" path="/var/lib/kubelet/pods/7aa0d48c-e676-4d10-bc96-616a4730715d/volumes"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.089135 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" podUID="5e6e5780-d702-4c2e-9045-3e74bb98136a" containerName="oauth-openshift" containerID="cri-o://cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2" gracePeriod=15
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.435769 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.468676 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"]
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.468954 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98083656-1422-49e1-a791-ed2ae6804df5" containerName="extract-content"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.468969 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="98083656-1422-49e1-a791-ed2ae6804df5" containerName="extract-content"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.468992 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" containerName="extract-utilities"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469000 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" containerName="extract-utilities"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469009 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716c94be-c2b0-4523-87a5-505ab53bd786" containerName="pruner"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469020 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="716c94be-c2b0-4523-87a5-505ab53bd786" containerName="pruner"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469031 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98083656-1422-49e1-a791-ed2ae6804df5" containerName="extract-utilities"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469039 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="98083656-1422-49e1-a791-ed2ae6804df5" containerName="extract-utilities"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469046 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98083656-1422-49e1-a791-ed2ae6804df5" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469054 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="98083656-1422-49e1-a791-ed2ae6804df5" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469065 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa0d48c-e676-4d10-bc96-616a4730715d" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469072 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa0d48c-e676-4d10-bc96-616a4730715d" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469083 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa0d48c-e676-4d10-bc96-616a4730715d" containerName="extract-utilities"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469091 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa0d48c-e676-4d10-bc96-616a4730715d" containerName="extract-utilities"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469105 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerName="extract-utilities"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469113 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerName="extract-utilities"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469120 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerName="extract-content"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469128 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerName="extract-content"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469136 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469143 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469155 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa0d48c-e676-4d10-bc96-616a4730715d" containerName="extract-content"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469163 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa0d48c-e676-4d10-bc96-616a4730715d" containerName="extract-content"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469173 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" containerName="extract-content"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469180 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" containerName="extract-content"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469192 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469200 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: E0930 13:39:33.469210 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6e5780-d702-4c2e-9045-3e74bb98136a" containerName="oauth-openshift"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469217 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6e5780-d702-4c2e-9045-3e74bb98136a" containerName="oauth-openshift"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469328 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c12b612-76e8-4dc5-960e-8d6af4e5d1d6" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469340 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="69483f45-9432-41f3-81e3-eb02b8b1fb00" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469351 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e6e5780-d702-4c2e-9045-3e74bb98136a" containerName="oauth-openshift"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469361 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa0d48c-e676-4d10-bc96-616a4730715d" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469372 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="98083656-1422-49e1-a791-ed2ae6804df5" containerName="registry-server"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.469384 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="716c94be-c2b0-4523-87a5-505ab53bd786" containerName="pruner"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.470686 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.474673 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"]
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.528528 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.528888 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-session\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.529008 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-audit-policies\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.529038 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7bce04cd-d354-4b70-97ee-8e98062efecc-audit-dir\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.629719 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-login\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.630102 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-error\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.630295 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-router-certs\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.630372 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-cliconfig\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.630753 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-dir\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.630913 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-session\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.630979 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-ocp-branding-template\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631001 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631045 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-idp-0-file-data\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631097 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-trusted-ca-bundle\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631145 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8fbs\" (UniqueName: \"kubernetes.io/projected/5e6e5780-d702-4c2e-9045-3e74bb98136a-kube-api-access-l8fbs\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631227 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-service-ca\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631289 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-provider-selection\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631345 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-serving-cert\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631397 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-policies\") pod \"5e6e5780-d702-4c2e-9045-3e74bb98136a\" (UID: \"5e6e5780-d702-4c2e-9045-3e74bb98136a\") "
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631532 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631570 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631677 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-session\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631733 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-router-certs\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631860 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631935 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc4x8\" (UniqueName: \"kubernetes.io/projected/7bce04cd-d354-4b70-97ee-8e98062efecc-kube-api-access-dc4x8\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.631990 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-template-error\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-audit-policies\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632094 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7bce04cd-d354-4b70-97ee-8e98062efecc-audit-dir\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632131 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-service-ca\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632181 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632266 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632316 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632362 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-template-login\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632620 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632633 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632672 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632752 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632775 4763 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-dir\") on node \"crc\" DevicePath \"\""
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632795 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.632814 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-audit-policies\") on node \"crc\" DevicePath \"\""
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.633549 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"
Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.634018 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.634090 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7bce04cd-d354-4b70-97ee-8e98062efecc-audit-dir\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.634896 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-audit-policies\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.637787 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6e5780-d702-4c2e-9045-3e74bb98136a-kube-api-access-l8fbs" (OuterVolumeSpecName: "kube-api-access-l8fbs") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "kube-api-access-l8fbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.637945 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.638574 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.639208 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.639445 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.639151 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-session\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.639793 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.640085 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.644827 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.645227 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "5e6e5780-d702-4c2e-9045-3e74bb98136a" (UID: "5e6e5780-d702-4c2e-9045-3e74bb98136a"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.734492 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.735312 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.735379 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-template-login\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.736173 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737090 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-router-certs\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737222 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737384 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc4x8\" (UniqueName: \"kubernetes.io/projected/7bce04cd-d354-4b70-97ee-8e98062efecc-kube-api-access-dc4x8\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737494 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-template-error\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: 
\"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737588 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-service-ca\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737707 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737833 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737882 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737913 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737942 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737969 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8fbs\" (UniqueName: \"kubernetes.io/projected/5e6e5780-d702-4c2e-9045-3e74bb98136a-kube-api-access-l8fbs\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.737997 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.738025 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.738057 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.738086 4763 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.738113 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5e6e5780-d702-4c2e-9045-3e74bb98136a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.738685 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.738748 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-service-ca\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.740385 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.740742 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-template-login\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.740928 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.741650 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-router-certs\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.741821 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " 
pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.743832 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.743972 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7bce04cd-d354-4b70-97ee-8e98062efecc-v4-0-config-user-template-error\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.766951 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc4x8\" (UniqueName: \"kubernetes.io/projected/7bce04cd-d354-4b70-97ee-8e98062efecc-kube-api-access-dc4x8\") pod \"oauth-openshift-5795c8b5fb-rzm76\" (UID: \"7bce04cd-d354-4b70-97ee-8e98062efecc\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:33 crc kubenswrapper[4763]: I0930 13:39:33.789773 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:34 crc kubenswrapper[4763]: I0930 13:39:34.174267 4763 generic.go:334] "Generic (PLEG): container finished" podID="5e6e5780-d702-4c2e-9045-3e74bb98136a" containerID="cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2" exitCode=0 Sep 30 13:39:34 crc kubenswrapper[4763]: I0930 13:39:34.174317 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" event={"ID":"5e6e5780-d702-4c2e-9045-3e74bb98136a","Type":"ContainerDied","Data":"cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2"} Sep 30 13:39:34 crc kubenswrapper[4763]: I0930 13:39:34.174338 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" Sep 30 13:39:34 crc kubenswrapper[4763]: I0930 13:39:34.174375 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vjdsr" event={"ID":"5e6e5780-d702-4c2e-9045-3e74bb98136a","Type":"ContainerDied","Data":"28c73a213de6c8ec410219e24e5f51b54feef7a4b0f003abfe4c06485504e56b"} Sep 30 13:39:34 crc kubenswrapper[4763]: I0930 13:39:34.174423 4763 scope.go:117] "RemoveContainer" containerID="cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2" Sep 30 13:39:34 crc kubenswrapper[4763]: I0930 13:39:34.193611 4763 scope.go:117] "RemoveContainer" containerID="cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2" Sep 30 13:39:34 crc kubenswrapper[4763]: E0930 13:39:34.194045 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2\": container with ID starting with cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2 not found: ID does not exist" containerID="cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2" Sep 30 13:39:34 crc kubenswrapper[4763]: I0930 13:39:34.194091 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2"} err="failed to get container status \"cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2\": rpc error: code = NotFound desc = could not find container \"cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2\": container with ID starting with cd793472fe87217c78a7339f213ae27c57738164e6d9bd5b8917a896cdb44bb2 not found: ID does not exist" Sep 30 13:39:34 crc kubenswrapper[4763]: I0930 13:39:34.210125 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vjdsr"] Sep 30 13:39:34 crc kubenswrapper[4763]: I0930 13:39:34.212749 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vjdsr"] Sep 30 13:39:34 crc kubenswrapper[4763]: I0930 13:39:34.259152 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5795c8b5fb-rzm76"] Sep 30 13:39:34 crc kubenswrapper[4763]: I0930 13:39:34.501490 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e6e5780-d702-4c2e-9045-3e74bb98136a" path="/var/lib/kubelet/pods/5e6e5780-d702-4c2e-9045-3e74bb98136a/volumes" Sep 30 13:39:35 crc kubenswrapper[4763]: I0930 13:39:35.184455 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" event={"ID":"7bce04cd-d354-4b70-97ee-8e98062efecc","Type":"ContainerStarted","Data":"70c210855e1398c37c47adbb364e6122f87bfdf0e549ddd5b9bfa61637061375"} Sep 30 13:39:35 crc kubenswrapper[4763]: I0930 13:39:35.184826 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:35 crc kubenswrapper[4763]: I0930 13:39:35.184842 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" event={"ID":"7bce04cd-d354-4b70-97ee-8e98062efecc","Type":"ContainerStarted","Data":"8b5a9bf54106e303d5c55221b236aecf9cdd7bccf179c74ef381d1987dc761f0"} Sep 30 13:39:35 crc kubenswrapper[4763]: I0930 
13:39:35.197263 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" Sep 30 13:39:35 crc kubenswrapper[4763]: I0930 13:39:35.223860 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5795c8b5fb-rzm76" podStartSLOduration=27.223722344 podStartE2EDuration="27.223722344s" podCreationTimestamp="2025-09-30 13:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:39:35.216276388 +0000 UTC m=+247.354836673" watchObservedRunningTime="2025-09-30 13:39:35.223722344 +0000 UTC m=+247.362282669" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.207830 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j5ns2"] Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.208702 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j5ns2" podUID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" containerName="registry-server" containerID="cri-o://b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802" gracePeriod=30 Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.225261 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6svmn"] Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.225651 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6svmn" podUID="783a0279-c32d-4085-9274-6291b7544803" containerName="registry-server" containerID="cri-o://8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4" gracePeriod=30 Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.231555 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dqjfv"] Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.231777 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" podUID="2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9" containerName="marketplace-operator" containerID="cri-o://02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e" gracePeriod=30 Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.240559 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dh7w"] Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.241105 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7dh7w" podUID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" containerName="registry-server" containerID="cri-o://ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571" gracePeriod=30 Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.256428 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qd4l"] Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.256984 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5qd4l" podUID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerName="registry-server" containerID="cri-o://7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424" gracePeriod=30 Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.260883 4763 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58tml"] Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.261528 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.268175 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58tml"] Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.313195 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fhqs\" (UniqueName: \"kubernetes.io/projected/50a4b247-74a5-4ceb-a32c-c92fce4f11b2-kube-api-access-4fhqs\") pod \"marketplace-operator-79b997595-58tml\" (UID: \"50a4b247-74a5-4ceb-a32c-c92fce4f11b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.313288 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50a4b247-74a5-4ceb-a32c-c92fce4f11b2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-58tml\" (UID: \"50a4b247-74a5-4ceb-a32c-c92fce4f11b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.313339 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50a4b247-74a5-4ceb-a32c-c92fce4f11b2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-58tml\" (UID: \"50a4b247-74a5-4ceb-a32c-c92fce4f11b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.416235 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50a4b247-74a5-4ceb-a32c-c92fce4f11b2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-58tml\" (UID: \"50a4b247-74a5-4ceb-a32c-c92fce4f11b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.416735 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fhqs\" (UniqueName: \"kubernetes.io/projected/50a4b247-74a5-4ceb-a32c-c92fce4f11b2-kube-api-access-4fhqs\") pod \"marketplace-operator-79b997595-58tml\" (UID: \"50a4b247-74a5-4ceb-a32c-c92fce4f11b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.416805 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50a4b247-74a5-4ceb-a32c-c92fce4f11b2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-58tml\" (UID: \"50a4b247-74a5-4ceb-a32c-c92fce4f11b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.420064 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50a4b247-74a5-4ceb-a32c-c92fce4f11b2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-58tml\" (UID: \"50a4b247-74a5-4ceb-a32c-c92fce4f11b2\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.423570 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50a4b247-74a5-4ceb-a32c-c92fce4f11b2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-58tml\" (UID: \"50a4b247-74a5-4ceb-a32c-c92fce4f11b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.435558 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fhqs\" (UniqueName: \"kubernetes.io/projected/50a4b247-74a5-4ceb-a32c-c92fce4f11b2-kube-api-access-4fhqs\") pod \"marketplace-operator-79b997595-58tml\" (UID: \"50a4b247-74a5-4ceb-a32c-c92fce4f11b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.583155 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.669178 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j5ns2" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.677224 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6svmn" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.679081 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.691050 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qd4l" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.701546 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dh7w" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.720451 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-utilities\") pod \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.720486 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-utilities\") pod \"783a0279-c32d-4085-9274-6291b7544803\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.720532 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr5zg\" (UniqueName: \"kubernetes.io/projected/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-kube-api-access-zr5zg\") pod \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.720556 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-catalog-content\") pod \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.720580 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khsj2\" (UniqueName: \"kubernetes.io/projected/783a0279-c32d-4085-9274-6291b7544803-kube-api-access-khsj2\") pod \"783a0279-c32d-4085-9274-6291b7544803\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.720637 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chk96\" (UniqueName: \"kubernetes.io/projected/3fe99667-cf30-4112-ae34-a5bdccbfc24f-kube-api-access-chk96\") pod \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.720672 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4kvs\" (UniqueName: \"kubernetes.io/projected/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-kube-api-access-g4kvs\") pod \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.720695 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-operator-metrics\") pod \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.720715 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-utilities\") pod \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\" (UID: \"3fe99667-cf30-4112-ae34-a5bdccbfc24f\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.721114 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-catalog-content\") pod \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.721165 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srz4b\" (UniqueName: \"kubernetes.io/projected/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-kube-api-access-srz4b\") pod \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.721200 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-catalog-content\") pod \"783a0279-c32d-4085-9274-6291b7544803\" (UID: \"783a0279-c32d-4085-9274-6291b7544803\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.721214 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-catalog-content\") pod \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\" (UID: \"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.721233 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-trusted-ca\") pod \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\" (UID: \"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.721249 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-utilities\") pod \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\" (UID: \"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a\") " Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.723030 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-utilities" (OuterVolumeSpecName: "utilities") pod "d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" (UID: "d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.723638 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-utilities" (OuterVolumeSpecName: "utilities") pod "b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" (UID: "b0ac82cc-de99-4b1b-a8f8-c36b5037cafb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.724242 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-utilities" (OuterVolumeSpecName: "utilities") pod "783a0279-c32d-4085-9274-6291b7544803" (UID: "783a0279-c32d-4085-9274-6291b7544803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.728636 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-kube-api-access-zr5zg" (OuterVolumeSpecName: "kube-api-access-zr5zg") pod "d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" (UID: "d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a"). InnerVolumeSpecName "kube-api-access-zr5zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.730683 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-utilities" (OuterVolumeSpecName: "utilities") pod "3fe99667-cf30-4112-ae34-a5bdccbfc24f" (UID: "3fe99667-cf30-4112-ae34-a5bdccbfc24f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.731371 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9" (UID: "2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.732843 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-kube-api-access-g4kvs" (OuterVolumeSpecName: "kube-api-access-g4kvs") pod "b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" (UID: "b0ac82cc-de99-4b1b-a8f8-c36b5037cafb"). InnerVolumeSpecName "kube-api-access-g4kvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.734502 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9" (UID: "2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.735174 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-kube-api-access-srz4b" (OuterVolumeSpecName: "kube-api-access-srz4b") pod "2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9" (UID: "2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9"). InnerVolumeSpecName "kube-api-access-srz4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.736859 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783a0279-c32d-4085-9274-6291b7544803-kube-api-access-khsj2" (OuterVolumeSpecName: "kube-api-access-khsj2") pod "783a0279-c32d-4085-9274-6291b7544803" (UID: "783a0279-c32d-4085-9274-6291b7544803"). InnerVolumeSpecName "kube-api-access-khsj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.744430 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fe99667-cf30-4112-ae34-a5bdccbfc24f" (UID: "3fe99667-cf30-4112-ae34-a5bdccbfc24f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.751874 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fe99667-cf30-4112-ae34-a5bdccbfc24f-kube-api-access-chk96" (OuterVolumeSpecName: "kube-api-access-chk96") pod "3fe99667-cf30-4112-ae34-a5bdccbfc24f" (UID: "3fe99667-cf30-4112-ae34-a5bdccbfc24f"). InnerVolumeSpecName "kube-api-access-chk96". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.788645 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" (UID: "b0ac82cc-de99-4b1b-a8f8-c36b5037cafb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.812658 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "783a0279-c32d-4085-9274-6291b7544803" (UID: "783a0279-c32d-4085-9274-6291b7544803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822484 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srz4b\" (UniqueName: \"kubernetes.io/projected/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-kube-api-access-srz4b\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822514 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822528 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822542 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822553 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822563 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822572 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783a0279-c32d-4085-9274-6291b7544803-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822582 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr5zg\" (UniqueName: \"kubernetes.io/projected/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-kube-api-access-zr5zg\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822593 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822618 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khsj2\" (UniqueName: \"kubernetes.io/projected/783a0279-c32d-4085-9274-6291b7544803-kube-api-access-khsj2\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822627 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chk96\" (UniqueName: \"kubernetes.io/projected/3fe99667-cf30-4112-ae34-a5bdccbfc24f-kube-api-access-chk96\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822636 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4kvs\" (UniqueName: \"kubernetes.io/projected/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb-kube-api-access-g4kvs\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822644 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.822653 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fe99667-cf30-4112-ae34-a5bdccbfc24f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.843357 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" (UID: "d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.923700 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:47 crc kubenswrapper[4763]: I0930 13:39:47.949773 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58tml"] Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.273093 4763 generic.go:334] "Generic (PLEG): container finished" podID="2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9" containerID="02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e" exitCode=0 Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.273191 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.273188 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" event={"ID":"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9","Type":"ContainerDied","Data":"02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e"} Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.274421 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dqjfv" event={"ID":"2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9","Type":"ContainerDied","Data":"505b873f37846c4ab50c2934b683230b40be4be7a3b5181c69fbe98d4b54c239"} Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.274449 4763 scope.go:117] "RemoveContainer" containerID="02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.276115 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-58tml" event={"ID":"50a4b247-74a5-4ceb-a32c-c92fce4f11b2","Type":"ContainerStarted","Data":"071da28d5b1b9f4cac9154bcab51930b1ada1024f5a7b407be076c5439d86d85"} Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.276161 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-58tml" event={"ID":"50a4b247-74a5-4ceb-a32c-c92fce4f11b2","Type":"ContainerStarted","Data":"4590839b6621793a4689ca56ce7653b02866ed3e27608a0cab48b43d049c0d59"} Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.276332 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.278153 4763 patch_prober.go:28] 
interesting pod/marketplace-operator-79b997595-58tml container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body=
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.278211 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-58tml" podUID="50a4b247-74a5-4ceb-a32c-c92fce4f11b2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused"
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.278380 4763 generic.go:334] "Generic (PLEG): container finished" podID="783a0279-c32d-4085-9274-6291b7544803" containerID="8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4" exitCode=0
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.278478 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6svmn"
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.278466 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6svmn" event={"ID":"783a0279-c32d-4085-9274-6291b7544803","Type":"ContainerDied","Data":"8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4"}
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.278545 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6svmn" event={"ID":"783a0279-c32d-4085-9274-6291b7544803","Type":"ContainerDied","Data":"26d1e419cbe94bb5a95e50e00f6b565a22bb8417090c864d662213fb39c5c6d1"}
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.283259 4763 generic.go:334] "Generic (PLEG): container finished" podID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" containerID="b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802" exitCode=0
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.283331 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j5ns2" event={"ID":"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb","Type":"ContainerDied","Data":"b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802"}
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.283358 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j5ns2" event={"ID":"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb","Type":"ContainerDied","Data":"9e94679ac3cced12a5a85d8138d3948709eb3aaa3ebb1ee2265ab8217cdf1c7a"}
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.283454 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j5ns2"
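
The two probe entries above show the usual first-seconds behavior of a freshly restarted operator: the kubelet's HTTP readiness check against http://10.217.0.55:8080/healthz is refused because the marketplace-operator container has not started listening yet, and a later "SyncLoop (probe)" entry below records the same pod turning ready. A minimal stdlib-Go sketch of an equivalent HTTP readiness check (illustrative only, not the kubelet's prober; only the URL is taken from the log, the one-second timeout is an assumption):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// checkReadiness performs one HTTP GET the way an HTTP readiness probe
// does: any transport error (e.g. "connect: connection refused") or a
// status outside 200-399 counts as a probe failure.
func checkReadiness(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Errorf("probe failure: %w", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("probe failure: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// Against a container that is still starting up, this prints the same
	// "connection refused" error recorded in the journal above.
	if err := checkReadiness("http://10.217.0.55:8080/healthz"); err != nil {
		fmt.Println(err)
	}
}
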
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.286537 4763 generic.go:334] "Generic (PLEG): container finished" podID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" containerID="ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571" exitCode=0
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.286630 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dh7w" event={"ID":"3fe99667-cf30-4112-ae34-a5bdccbfc24f","Type":"ContainerDied","Data":"ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571"}
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.286669 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dh7w" event={"ID":"3fe99667-cf30-4112-ae34-a5bdccbfc24f","Type":"ContainerDied","Data":"d872bec5fe4f7d1e14b93c270d7010ed18826c54d51d2ab96ee8d81540796e00"}
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.286751 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dh7w"
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.289447 4763 generic.go:334] "Generic (PLEG): container finished" podID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerID="7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424" exitCode=0
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.289574 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qd4l" event={"ID":"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a","Type":"ContainerDied","Data":"7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424"}
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.289623 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qd4l"
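
The interleaved "Generic (PLEG): container finished" and "SyncLoop (PLEG): event for pod" entries come from the kubelet's pod lifecycle event generator, which periodically relists container state from the runtime and turns the diff into ContainerStarted/ContainerDied events. A toy sketch of that relist-and-diff step (illustrative stdlib Go, not the kubelet's pleg package; the snapshot contents are hypothetical):

package main

import "fmt"

// event mirrors the {Type, Data} pairs printed in the journal entries above.
type event struct{ Type, Data string }

// relist diffs two snapshots of "containers currently running" and derives
// lifecycle events, which is the essence of the PLEG entries above.
func relist(prev, curr map[string]bool) []event {
	var events []event
	for id := range curr {
		if !prev[id] {
			events = append(events, event{"ContainerStarted", id})
		}
	}
	for id := range prev {
		if !curr[id] {
			events = append(events, event{"ContainerDied", id})
		}
	}
	return events
}

func main() {
	// Hypothetical snapshots: one container exited between two relists.
	prev := map[string]bool{"7b0a3839...": true}
	curr := map[string]bool{}
	fmt.Println(relist(prev, curr)) // [{ContainerDied 7b0a3839...}]
}
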
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.289648 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qd4l" event={"ID":"d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a","Type":"ContainerDied","Data":"3fdf753f6ea5040413cbee858409cca2704f7d226f377e1996ed6858d9379567"}
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.291037 4763 scope.go:117] "RemoveContainer" containerID="02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e"
Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.291442 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e\": container with ID starting with 02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e not found: ID does not exist" containerID="02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e"
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.291526 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e"} err="failed to get container status \"02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e\": rpc error: code = NotFound desc = could not find container \"02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e\": container with ID starting with 02fc944360e07321c7639886edd606c7b9e6055a616f63f801d8e12e499ee50e not found: ID does not exist"
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.291615 4763 scope.go:117] "RemoveContainer" containerID="8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4"
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.302890 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-58tml" podStartSLOduration=1.3028727359999999 podStartE2EDuration="1.302872736s" podCreationTimestamp="2025-09-30 13:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:39:48.297721218 +0000 UTC m=+260.436281513" watchObservedRunningTime="2025-09-30 13:39:48.302872736 +0000 UTC m=+260.441433021"
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.312759 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dqjfv"]
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.315011 4763 scope.go:117] "RemoveContainer" containerID="3afda04886cd1b2ef9dc8b6940c6df0b3d2d803e5449b48b3aa0f3667b03837d"
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.324229 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dqjfv"]
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.339092 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6svmn"]
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.352146 4763 scope.go:117] "RemoveContainer" containerID="1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c"
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.352921 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6svmn"]
Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.356690 4763 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dh7w"] Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.371478 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dh7w"] Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.387930 4763 scope.go:117] "RemoveContainer" containerID="8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4" Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.390127 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4\": container with ID starting with 8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4 not found: ID does not exist" containerID="8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.390186 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4"} err="failed to get container status \"8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4\": rpc error: code = NotFound desc = could not find container \"8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4\": container with ID starting with 8ed68660ef5c5e5bfdceae75349c230095275b87fdbd863f9601ad32e406a2b4 not found: ID does not exist" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.390213 4763 scope.go:117] "RemoveContainer" containerID="3afda04886cd1b2ef9dc8b6940c6df0b3d2d803e5449b48b3aa0f3667b03837d" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.391196 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j5ns2"] Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.393832 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3afda04886cd1b2ef9dc8b6940c6df0b3d2d803e5449b48b3aa0f3667b03837d\": container with ID starting with 3afda04886cd1b2ef9dc8b6940c6df0b3d2d803e5449b48b3aa0f3667b03837d not found: ID does not exist" containerID="3afda04886cd1b2ef9dc8b6940c6df0b3d2d803e5449b48b3aa0f3667b03837d" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.393867 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3afda04886cd1b2ef9dc8b6940c6df0b3d2d803e5449b48b3aa0f3667b03837d"} err="failed to get container status \"3afda04886cd1b2ef9dc8b6940c6df0b3d2d803e5449b48b3aa0f3667b03837d\": rpc error: code = NotFound desc = could not find container \"3afda04886cd1b2ef9dc8b6940c6df0b3d2d803e5449b48b3aa0f3667b03837d\": container with ID starting with 3afda04886cd1b2ef9dc8b6940c6df0b3d2d803e5449b48b3aa0f3667b03837d not found: ID does not exist" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.393890 4763 scope.go:117] "RemoveContainer" containerID="1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c" Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.394126 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c\": container with ID starting with 1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c not found: ID does not exist" 
containerID="1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.394145 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c"} err="failed to get container status \"1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c\": rpc error: code = NotFound desc = could not find container \"1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c\": container with ID starting with 1c604a657a4816c18904f32621fb984578d1880dffcd46e6e014344a34866f0c not found: ID does not exist" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.394157 4763 scope.go:117] "RemoveContainer" containerID="b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.394383 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j5ns2"] Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.397803 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qd4l"] Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.400083 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5qd4l"] Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.409727 4763 scope.go:117] "RemoveContainer" containerID="b32c51f3fb059dec6462c3288c928cd52679d4d00bb400421111c6f2570cd467" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.424570 4763 scope.go:117] "RemoveContainer" containerID="66a1792790848c0c6df748b1df400a8b74f33d2b2d293bf8b1c8adce639d75a1" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.436888 4763 scope.go:117] "RemoveContainer" containerID="b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802" Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.437284 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802\": container with ID starting with b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802 not found: ID does not exist" containerID="b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.437322 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802"} err="failed to get container status \"b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802\": rpc error: code = NotFound desc = could not find container \"b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802\": container with ID starting with b28fe141a843ad802bf522e87a48eebda17bcc2206659b260aff38b713059802 not found: ID does not exist" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.437349 4763 scope.go:117] "RemoveContainer" containerID="b32c51f3fb059dec6462c3288c928cd52679d4d00bb400421111c6f2570cd467" Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.437948 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b32c51f3fb059dec6462c3288c928cd52679d4d00bb400421111c6f2570cd467\": container with ID starting with b32c51f3fb059dec6462c3288c928cd52679d4d00bb400421111c6f2570cd467 not found: ID does not exist" 
containerID="b32c51f3fb059dec6462c3288c928cd52679d4d00bb400421111c6f2570cd467" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.437979 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32c51f3fb059dec6462c3288c928cd52679d4d00bb400421111c6f2570cd467"} err="failed to get container status \"b32c51f3fb059dec6462c3288c928cd52679d4d00bb400421111c6f2570cd467\": rpc error: code = NotFound desc = could not find container \"b32c51f3fb059dec6462c3288c928cd52679d4d00bb400421111c6f2570cd467\": container with ID starting with b32c51f3fb059dec6462c3288c928cd52679d4d00bb400421111c6f2570cd467 not found: ID does not exist" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.438004 4763 scope.go:117] "RemoveContainer" containerID="66a1792790848c0c6df748b1df400a8b74f33d2b2d293bf8b1c8adce639d75a1" Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.438258 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a1792790848c0c6df748b1df400a8b74f33d2b2d293bf8b1c8adce639d75a1\": container with ID starting with 66a1792790848c0c6df748b1df400a8b74f33d2b2d293bf8b1c8adce639d75a1 not found: ID does not exist" containerID="66a1792790848c0c6df748b1df400a8b74f33d2b2d293bf8b1c8adce639d75a1" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.438293 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a1792790848c0c6df748b1df400a8b74f33d2b2d293bf8b1c8adce639d75a1"} err="failed to get container status \"66a1792790848c0c6df748b1df400a8b74f33d2b2d293bf8b1c8adce639d75a1\": rpc error: code = NotFound desc = could not find container \"66a1792790848c0c6df748b1df400a8b74f33d2b2d293bf8b1c8adce639d75a1\": container with ID starting with 66a1792790848c0c6df748b1df400a8b74f33d2b2d293bf8b1c8adce639d75a1 not found: ID does not exist" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.438318 4763 scope.go:117] "RemoveContainer" containerID="ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.451429 4763 scope.go:117] "RemoveContainer" containerID="1acf90996e39efe5abf11e96438794c1e23f5fcb16cc11bc07ed4dd2daefe97a" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.462301 4763 scope.go:117] "RemoveContainer" containerID="37df57789e38ce3116ece041145a4fddd1558d11f1f5e17e0d016cfebf1ee6e0" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.477237 4763 scope.go:117] "RemoveContainer" containerID="ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571" Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.477967 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571\": container with ID starting with ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571 not found: ID does not exist" containerID="ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.477998 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571"} err="failed to get container status \"ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571\": rpc error: code = NotFound desc = could not find container 
\"ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571\": container with ID starting with ff7caa826b34bb0f43e6ee99da5d46a10b9e5a91a7ef3457e73322b52866c571 not found: ID does not exist" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.478022 4763 scope.go:117] "RemoveContainer" containerID="1acf90996e39efe5abf11e96438794c1e23f5fcb16cc11bc07ed4dd2daefe97a" Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.478361 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1acf90996e39efe5abf11e96438794c1e23f5fcb16cc11bc07ed4dd2daefe97a\": container with ID starting with 1acf90996e39efe5abf11e96438794c1e23f5fcb16cc11bc07ed4dd2daefe97a not found: ID does not exist" containerID="1acf90996e39efe5abf11e96438794c1e23f5fcb16cc11bc07ed4dd2daefe97a" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.478402 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1acf90996e39efe5abf11e96438794c1e23f5fcb16cc11bc07ed4dd2daefe97a"} err="failed to get container status \"1acf90996e39efe5abf11e96438794c1e23f5fcb16cc11bc07ed4dd2daefe97a\": rpc error: code = NotFound desc = could not find container \"1acf90996e39efe5abf11e96438794c1e23f5fcb16cc11bc07ed4dd2daefe97a\": container with ID starting with 1acf90996e39efe5abf11e96438794c1e23f5fcb16cc11bc07ed4dd2daefe97a not found: ID does not exist" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.478431 4763 scope.go:117] "RemoveContainer" containerID="37df57789e38ce3116ece041145a4fddd1558d11f1f5e17e0d016cfebf1ee6e0" Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.481100 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37df57789e38ce3116ece041145a4fddd1558d11f1f5e17e0d016cfebf1ee6e0\": container with ID starting with 37df57789e38ce3116ece041145a4fddd1558d11f1f5e17e0d016cfebf1ee6e0 not found: ID does not exist" containerID="37df57789e38ce3116ece041145a4fddd1558d11f1f5e17e0d016cfebf1ee6e0" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.481126 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37df57789e38ce3116ece041145a4fddd1558d11f1f5e17e0d016cfebf1ee6e0"} err="failed to get container status \"37df57789e38ce3116ece041145a4fddd1558d11f1f5e17e0d016cfebf1ee6e0\": rpc error: code = NotFound desc = could not find container \"37df57789e38ce3116ece041145a4fddd1558d11f1f5e17e0d016cfebf1ee6e0\": container with ID starting with 37df57789e38ce3116ece041145a4fddd1558d11f1f5e17e0d016cfebf1ee6e0 not found: ID does not exist" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.481163 4763 scope.go:117] "RemoveContainer" containerID="7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.495085 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9" path="/var/lib/kubelet/pods/2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9/volumes" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.495511 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" path="/var/lib/kubelet/pods/3fe99667-cf30-4112-ae34-a5bdccbfc24f/volumes" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.496284 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783a0279-c32d-4085-9274-6291b7544803" 
path="/var/lib/kubelet/pods/783a0279-c32d-4085-9274-6291b7544803/volumes" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.497438 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" path="/var/lib/kubelet/pods/b0ac82cc-de99-4b1b-a8f8-c36b5037cafb/volumes" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.498026 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" path="/var/lib/kubelet/pods/d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a/volumes" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.499366 4763 scope.go:117] "RemoveContainer" containerID="c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.517379 4763 scope.go:117] "RemoveContainer" containerID="682775976d46fdbd58d5d7f21245bcffb3c88acb8466615e72422cd701a4fb97" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.535181 4763 scope.go:117] "RemoveContainer" containerID="7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424" Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.536306 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424\": container with ID starting with 7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424 not found: ID does not exist" containerID="7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.536338 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424"} err="failed to get container status \"7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424\": rpc error: code = NotFound desc = could not find container \"7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424\": container with ID starting with 7b0a3839a07b62feb6b5160e61ed076b5959a8105c881b4d54bd2f82d8ed8424 not found: ID does not exist" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.536408 4763 scope.go:117] "RemoveContainer" containerID="c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b" Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.537002 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b\": container with ID starting with c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b not found: ID does not exist" containerID="c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.537214 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b"} err="failed to get container status \"c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b\": rpc error: code = NotFound desc = could not find container \"c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b\": container with ID starting with c2b269afe0d77c4a7e6914dee827bc9ebe0478042c65195683950e0d0e13b87b not found: ID does not exist" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.537233 4763 scope.go:117] "RemoveContainer" 
containerID="682775976d46fdbd58d5d7f21245bcffb3c88acb8466615e72422cd701a4fb97" Sep 30 13:39:48 crc kubenswrapper[4763]: E0930 13:39:48.538386 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682775976d46fdbd58d5d7f21245bcffb3c88acb8466615e72422cd701a4fb97\": container with ID starting with 682775976d46fdbd58d5d7f21245bcffb3c88acb8466615e72422cd701a4fb97 not found: ID does not exist" containerID="682775976d46fdbd58d5d7f21245bcffb3c88acb8466615e72422cd701a4fb97" Sep 30 13:39:48 crc kubenswrapper[4763]: I0930 13:39:48.538419 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682775976d46fdbd58d5d7f21245bcffb3c88acb8466615e72422cd701a4fb97"} err="failed to get container status \"682775976d46fdbd58d5d7f21245bcffb3c88acb8466615e72422cd701a4fb97\": rpc error: code = NotFound desc = could not find container \"682775976d46fdbd58d5d7f21245bcffb3c88acb8466615e72422cd701a4fb97\": container with ID starting with 682775976d46fdbd58d5d7f21245bcffb3c88acb8466615e72422cd701a4fb97 not found: ID does not exist" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.219829 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5nfg4"] Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220042 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" containerName="extract-content" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220057 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" containerName="extract-content" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220071 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" containerName="registry-server" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220078 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" containerName="registry-server" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220088 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerName="extract-content" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220096 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerName="extract-content" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220105 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783a0279-c32d-4085-9274-6291b7544803" containerName="registry-server" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220111 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="783a0279-c32d-4085-9274-6291b7544803" containerName="registry-server" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220122 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9" containerName="marketplace-operator" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220128 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9" containerName="marketplace-operator" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220137 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" containerName="extract-utilities" Sep 30 13:39:49 crc kubenswrapper[4763]: 
I0930 13:39:49.220143 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" containerName="extract-utilities" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220152 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" containerName="registry-server" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220158 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" containerName="registry-server" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220165 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerName="extract-utilities" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220171 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerName="extract-utilities" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220180 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" containerName="extract-utilities" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220186 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" containerName="extract-utilities" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220192 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783a0279-c32d-4085-9274-6291b7544803" containerName="extract-content" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220198 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="783a0279-c32d-4085-9274-6291b7544803" containerName="extract-content" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220206 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerName="registry-server" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220211 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerName="registry-server" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220219 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783a0279-c32d-4085-9274-6291b7544803" containerName="extract-utilities" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220224 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="783a0279-c32d-4085-9274-6291b7544803" containerName="extract-utilities" Sep 30 13:39:49 crc kubenswrapper[4763]: E0930 13:39:49.220234 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" containerName="extract-content" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220240 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" containerName="extract-content" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220323 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="783a0279-c32d-4085-9274-6291b7544803" containerName="registry-server" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220331 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ac82cc-de99-4b1b-a8f8-c36b5037cafb" containerName="registry-server" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220340 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fe99667-cf30-4112-ae34-a5bdccbfc24f" containerName="registry-server" 
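
The burst of cpu_manager, state_mem, and memory_manager entries above is the kubelet's resource managers sweeping checkpointed per-container assignments that belong to the five pods just deleted, before admitting certified-operators-5nfg4. A simplified sketch of that sweep (illustrative stdlib Go, not the kubelet's cpu/memory manager state packages; the assignment value is hypothetical, the UIDs are taken from the log):

package main

import "fmt"

// key identifies a checkpointed assignment, matching the podUID and
// containerName fields printed by cpu_manager/memory_manager above.
type key struct{ podUID, containerName string }

// assignments stands in for the kubelet's in-memory CPU/memory state;
// the entry below is hypothetical, keyed by a pod UID from the log.
var assignments = map[key]string{
	{"b0ac82cc-de99-4b1b-a8f8-c36b5037cafb", "registry-server"}: "cpuset 0-1",
}

// removeStaleState drops state for containers whose pod is no longer
// active, which is what the RemoveStaleState entries above record.
func removeStaleState(activePods map[string]bool) {
	for k := range assignments {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %s\n",
				k.containerName, k.podUID)
			delete(assignments, k)
		}
	}
}

func main() {
	// Only the newly admitted catalog pod is active; the deleted pods'
	// assignments are swept before its containers start.
	removeStaleState(map[string]bool{"a8295f8d-50ee-49f5-890a-77e5bb976ce4": true})
}
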
Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220347 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d23f5ffd-eee7-4ca7-b13f-c1175ab7eb9a" containerName="registry-server" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.220359 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef87fb1-4e1e-4c2b-b0c2-c96a4fd4e1b9" containerName="marketplace-operator" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.221309 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.222886 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.227587 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5nfg4"] Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.302913 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-58tml" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.342341 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5n95\" (UniqueName: \"kubernetes.io/projected/a8295f8d-50ee-49f5-890a-77e5bb976ce4-kube-api-access-x5n95\") pod \"certified-operators-5nfg4\" (UID: \"a8295f8d-50ee-49f5-890a-77e5bb976ce4\") " pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.342384 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8295f8d-50ee-49f5-890a-77e5bb976ce4-utilities\") pod \"certified-operators-5nfg4\" (UID: \"a8295f8d-50ee-49f5-890a-77e5bb976ce4\") " pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.342428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8295f8d-50ee-49f5-890a-77e5bb976ce4-catalog-content\") pod \"certified-operators-5nfg4\" (UID: \"a8295f8d-50ee-49f5-890a-77e5bb976ce4\") " pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.444081 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5n95\" (UniqueName: \"kubernetes.io/projected/a8295f8d-50ee-49f5-890a-77e5bb976ce4-kube-api-access-x5n95\") pod \"certified-operators-5nfg4\" (UID: \"a8295f8d-50ee-49f5-890a-77e5bb976ce4\") " pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.444122 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8295f8d-50ee-49f5-890a-77e5bb976ce4-utilities\") pod \"certified-operators-5nfg4\" (UID: \"a8295f8d-50ee-49f5-890a-77e5bb976ce4\") " pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.444141 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8295f8d-50ee-49f5-890a-77e5bb976ce4-catalog-content\") pod \"certified-operators-5nfg4\" (UID: \"a8295f8d-50ee-49f5-890a-77e5bb976ce4\") 
" pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.444559 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8295f8d-50ee-49f5-890a-77e5bb976ce4-catalog-content\") pod \"certified-operators-5nfg4\" (UID: \"a8295f8d-50ee-49f5-890a-77e5bb976ce4\") " pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.444802 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8295f8d-50ee-49f5-890a-77e5bb976ce4-utilities\") pod \"certified-operators-5nfg4\" (UID: \"a8295f8d-50ee-49f5-890a-77e5bb976ce4\") " pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.466042 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5n95\" (UniqueName: \"kubernetes.io/projected/a8295f8d-50ee-49f5-890a-77e5bb976ce4-kube-api-access-x5n95\") pod \"certified-operators-5nfg4\" (UID: \"a8295f8d-50ee-49f5-890a-77e5bb976ce4\") " pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.547265 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.717222 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5nfg4"] Sep 30 13:39:49 crc kubenswrapper[4763]: W0930 13:39:49.726156 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8295f8d_50ee_49f5_890a_77e5bb976ce4.slice/crio-99aed61690594f7a74e2513e3ecd89283153fadf7889b03ac69daadb36df9c21 WatchSource:0}: Error finding container 99aed61690594f7a74e2513e3ecd89283153fadf7889b03ac69daadb36df9c21: Status 404 returned error can't find the container with id 99aed61690594f7a74e2513e3ecd89283153fadf7889b03ac69daadb36df9c21 Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.826791 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9j8qz"] Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.828163 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.832818 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.836877 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j8qz"] Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.949230 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bab0204-e2f4-4666-a525-ce8b8cea5f17-utilities\") pod \"redhat-marketplace-9j8qz\" (UID: \"6bab0204-e2f4-4666-a525-ce8b8cea5f17\") " pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.949285 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bab0204-e2f4-4666-a525-ce8b8cea5f17-catalog-content\") pod \"redhat-marketplace-9j8qz\" (UID: \"6bab0204-e2f4-4666-a525-ce8b8cea5f17\") " pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:39:49 crc kubenswrapper[4763]: I0930 13:39:49.949323 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j65z4\" (UniqueName: \"kubernetes.io/projected/6bab0204-e2f4-4666-a525-ce8b8cea5f17-kube-api-access-j65z4\") pod \"redhat-marketplace-9j8qz\" (UID: \"6bab0204-e2f4-4666-a525-ce8b8cea5f17\") " pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:39:50 crc kubenswrapper[4763]: I0930 13:39:50.050565 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j65z4\" (UniqueName: \"kubernetes.io/projected/6bab0204-e2f4-4666-a525-ce8b8cea5f17-kube-api-access-j65z4\") pod \"redhat-marketplace-9j8qz\" (UID: \"6bab0204-e2f4-4666-a525-ce8b8cea5f17\") " pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:39:50 crc kubenswrapper[4763]: I0930 13:39:50.050718 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bab0204-e2f4-4666-a525-ce8b8cea5f17-utilities\") pod \"redhat-marketplace-9j8qz\" (UID: \"6bab0204-e2f4-4666-a525-ce8b8cea5f17\") " pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:39:50 crc kubenswrapper[4763]: I0930 13:39:50.050742 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bab0204-e2f4-4666-a525-ce8b8cea5f17-catalog-content\") pod \"redhat-marketplace-9j8qz\" (UID: \"6bab0204-e2f4-4666-a525-ce8b8cea5f17\") " pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:39:50 crc kubenswrapper[4763]: I0930 13:39:50.051212 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bab0204-e2f4-4666-a525-ce8b8cea5f17-catalog-content\") pod \"redhat-marketplace-9j8qz\" (UID: \"6bab0204-e2f4-4666-a525-ce8b8cea5f17\") " pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:39:50 crc kubenswrapper[4763]: I0930 13:39:50.051454 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bab0204-e2f4-4666-a525-ce8b8cea5f17-utilities\") pod \"redhat-marketplace-9j8qz\" (UID: 
\"6bab0204-e2f4-4666-a525-ce8b8cea5f17\") " pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:39:50 crc kubenswrapper[4763]: I0930 13:39:50.078965 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j65z4\" (UniqueName: \"kubernetes.io/projected/6bab0204-e2f4-4666-a525-ce8b8cea5f17-kube-api-access-j65z4\") pod \"redhat-marketplace-9j8qz\" (UID: \"6bab0204-e2f4-4666-a525-ce8b8cea5f17\") " pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:39:50 crc kubenswrapper[4763]: I0930 13:39:50.147770 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:39:50 crc kubenswrapper[4763]: I0930 13:39:50.308438 4763 generic.go:334] "Generic (PLEG): container finished" podID="a8295f8d-50ee-49f5-890a-77e5bb976ce4" containerID="dfe4999813c086c58598eaf849bb4cd92b7357f7da23f43eed4607d6dd54b711" exitCode=0 Sep 30 13:39:50 crc kubenswrapper[4763]: I0930 13:39:50.308579 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nfg4" event={"ID":"a8295f8d-50ee-49f5-890a-77e5bb976ce4","Type":"ContainerDied","Data":"dfe4999813c086c58598eaf849bb4cd92b7357f7da23f43eed4607d6dd54b711"} Sep 30 13:39:50 crc kubenswrapper[4763]: I0930 13:39:50.308838 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nfg4" event={"ID":"a8295f8d-50ee-49f5-890a-77e5bb976ce4","Type":"ContainerStarted","Data":"99aed61690594f7a74e2513e3ecd89283153fadf7889b03ac69daadb36df9c21"} Sep 30 13:39:50 crc kubenswrapper[4763]: I0930 13:39:50.388993 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9j8qz"] Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.314727 4763 generic.go:334] "Generic (PLEG): container finished" podID="a8295f8d-50ee-49f5-890a-77e5bb976ce4" containerID="bacc0b6026e7aa1a19b23da6ec9d7fff4309f33e95395725f4fa3e1f4e945354" exitCode=0 Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.314788 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nfg4" event={"ID":"a8295f8d-50ee-49f5-890a-77e5bb976ce4","Type":"ContainerDied","Data":"bacc0b6026e7aa1a19b23da6ec9d7fff4309f33e95395725f4fa3e1f4e945354"} Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.318014 4763 generic.go:334] "Generic (PLEG): container finished" podID="6bab0204-e2f4-4666-a525-ce8b8cea5f17" containerID="2180cf230b857de2f3c6a2e9d8d4a5a3f8bc9f8e394342459dea0d6e115a5d9d" exitCode=0 Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.318041 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j8qz" event={"ID":"6bab0204-e2f4-4666-a525-ce8b8cea5f17","Type":"ContainerDied","Data":"2180cf230b857de2f3c6a2e9d8d4a5a3f8bc9f8e394342459dea0d6e115a5d9d"} Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.318065 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j8qz" event={"ID":"6bab0204-e2f4-4666-a525-ce8b8cea5f17","Type":"ContainerStarted","Data":"e30f1dd55363d0b8167ae7b624412165b86e6bb9adb3b517d2f31b5a1511826a"} Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.620917 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h4fzf"] Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.621892 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.623839 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.634762 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4fzf"] Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.673035 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxbpj\" (UniqueName: \"kubernetes.io/projected/c69337fe-42df-4d48-8254-9408d35e644c-kube-api-access-kxbpj\") pod \"redhat-operators-h4fzf\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.673160 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-catalog-content\") pod \"redhat-operators-h4fzf\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.673251 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-utilities\") pod \"redhat-operators-h4fzf\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.774744 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxbpj\" (UniqueName: \"kubernetes.io/projected/c69337fe-42df-4d48-8254-9408d35e644c-kube-api-access-kxbpj\") pod \"redhat-operators-h4fzf\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.775068 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-catalog-content\") pod \"redhat-operators-h4fzf\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.775481 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-catalog-content\") pod \"redhat-operators-h4fzf\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.776274 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-utilities\") pod \"redhat-operators-h4fzf\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.776626 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-utilities\") pod \"redhat-operators-h4fzf\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " 
pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.794610 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxbpj\" (UniqueName: \"kubernetes.io/projected/c69337fe-42df-4d48-8254-9408d35e644c-kube-api-access-kxbpj\") pod \"redhat-operators-h4fzf\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:39:51 crc kubenswrapper[4763]: I0930 13:39:51.960089 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.152186 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4fzf"] Sep 30 13:39:52 crc kubenswrapper[4763]: W0930 13:39:52.157481 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc69337fe_42df_4d48_8254_9408d35e644c.slice/crio-5a0f38d3558878cafa483b1ba5cb7383025c38312a61ff0134add7c80fdda98f WatchSource:0}: Error finding container 5a0f38d3558878cafa483b1ba5cb7383025c38312a61ff0134add7c80fdda98f: Status 404 returned error can't find the container with id 5a0f38d3558878cafa483b1ba5cb7383025c38312a61ff0134add7c80fdda98f Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.219312 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b6brb"] Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.220339 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b6brb" Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.222649 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.235924 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b6brb"] Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.282929 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqlkl\" (UniqueName: \"kubernetes.io/projected/adf74762-2792-4fe2-8ce5-e5e7c7f88469-kube-api-access-zqlkl\") pod \"community-operators-b6brb\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " pod="openshift-marketplace/community-operators-b6brb" Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.283065 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-utilities\") pod \"community-operators-b6brb\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " pod="openshift-marketplace/community-operators-b6brb" Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.283115 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-catalog-content\") pod \"community-operators-b6brb\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " pod="openshift-marketplace/community-operators-b6brb" Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.324060 4763 generic.go:334] "Generic (PLEG): container finished" podID="c69337fe-42df-4d48-8254-9408d35e644c" 
containerID="67d34b5d7aea0fe5b17cfb8de200769e0179fe500e6b8927760563ac2282eaee" exitCode=0 Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.324213 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4fzf" event={"ID":"c69337fe-42df-4d48-8254-9408d35e644c","Type":"ContainerDied","Data":"67d34b5d7aea0fe5b17cfb8de200769e0179fe500e6b8927760563ac2282eaee"} Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.324409 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4fzf" event={"ID":"c69337fe-42df-4d48-8254-9408d35e644c","Type":"ContainerStarted","Data":"5a0f38d3558878cafa483b1ba5cb7383025c38312a61ff0134add7c80fdda98f"} Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.328033 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j8qz" event={"ID":"6bab0204-e2f4-4666-a525-ce8b8cea5f17","Type":"ContainerStarted","Data":"6f3a52670d6948552cc52363f48f59dc3c1add9fd7796bb31432a665065966ea"} Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.331081 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nfg4" event={"ID":"a8295f8d-50ee-49f5-890a-77e5bb976ce4","Type":"ContainerStarted","Data":"bc6966e211f8f8dd6c5d2c78ba36bc8e1f99f5e0d6af48c73cc7ba1b99b4f6dc"} Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.365789 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5nfg4" podStartSLOduration=1.745770799 podStartE2EDuration="3.365770455s" podCreationTimestamp="2025-09-30 13:39:49 +0000 UTC" firstStartedPulling="2025-09-30 13:39:50.311453588 +0000 UTC m=+262.450013873" lastFinishedPulling="2025-09-30 13:39:51.931453244 +0000 UTC m=+264.070013529" observedRunningTime="2025-09-30 13:39:52.363413292 +0000 UTC m=+264.501973577" watchObservedRunningTime="2025-09-30 13:39:52.365770455 +0000 UTC m=+264.504330740" Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.384867 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqlkl\" (UniqueName: \"kubernetes.io/projected/adf74762-2792-4fe2-8ce5-e5e7c7f88469-kube-api-access-zqlkl\") pod \"community-operators-b6brb\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " pod="openshift-marketplace/community-operators-b6brb" Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.384940 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-utilities\") pod \"community-operators-b6brb\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " pod="openshift-marketplace/community-operators-b6brb" Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.384972 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-catalog-content\") pod \"community-operators-b6brb\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " pod="openshift-marketplace/community-operators-b6brb" Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.385478 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-utilities\") pod \"community-operators-b6brb\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " 
pod="openshift-marketplace/community-operators-b6brb"
Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.385547 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-catalog-content\") pod \"community-operators-b6brb\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " pod="openshift-marketplace/community-operators-b6brb"
Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.408546 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqlkl\" (UniqueName: \"kubernetes.io/projected/adf74762-2792-4fe2-8ce5-e5e7c7f88469-kube-api-access-zqlkl\") pod \"community-operators-b6brb\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " pod="openshift-marketplace/community-operators-b6brb"
Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.545211 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b6brb"
Sep 30 13:39:52 crc kubenswrapper[4763]: I0930 13:39:52.759881 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b6brb"]
Sep 30 13:39:52 crc kubenswrapper[4763]: W0930 13:39:52.771373 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf74762_2792_4fe2_8ce5_e5e7c7f88469.slice/crio-896814538cf8adff64302e6c7bc7d87dcbc0ef26731d607b9a39ef7ec3e13cbc WatchSource:0}: Error finding container 896814538cf8adff64302e6c7bc7d87dcbc0ef26731d607b9a39ef7ec3e13cbc: Status 404 returned error can't find the container with id 896814538cf8adff64302e6c7bc7d87dcbc0ef26731d607b9a39ef7ec3e13cbc
Sep 30 13:39:53 crc kubenswrapper[4763]: I0930 13:39:53.338050 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4fzf" event={"ID":"c69337fe-42df-4d48-8254-9408d35e644c","Type":"ContainerStarted","Data":"0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80"}
Sep 30 13:39:53 crc kubenswrapper[4763]: I0930 13:39:53.341133 4763 generic.go:334] "Generic (PLEG): container finished" podID="6bab0204-e2f4-4666-a525-ce8b8cea5f17" containerID="6f3a52670d6948552cc52363f48f59dc3c1add9fd7796bb31432a665065966ea" exitCode=0
Sep 30 13:39:53 crc kubenswrapper[4763]: I0930 13:39:53.341213 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j8qz" event={"ID":"6bab0204-e2f4-4666-a525-ce8b8cea5f17","Type":"ContainerDied","Data":"6f3a52670d6948552cc52363f48f59dc3c1add9fd7796bb31432a665065966ea"}
Sep 30 13:39:53 crc kubenswrapper[4763]: I0930 13:39:53.343438 4763 generic.go:334] "Generic (PLEG): container finished" podID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" containerID="fbcf4bc423f396ecc77b30cc70b1a9e25af31c1e3b6bf764c38fd4ee0818657d" exitCode=0
Sep 30 13:39:53 crc kubenswrapper[4763]: I0930 13:39:53.344486 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6brb" event={"ID":"adf74762-2792-4fe2-8ce5-e5e7c7f88469","Type":"ContainerDied","Data":"fbcf4bc423f396ecc77b30cc70b1a9e25af31c1e3b6bf764c38fd4ee0818657d"}
Sep 30 13:39:53 crc kubenswrapper[4763]: I0930 13:39:53.344512 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6brb" event={"ID":"adf74762-2792-4fe2-8ce5-e5e7c7f88469","Type":"ContainerStarted","Data":"896814538cf8adff64302e6c7bc7d87dcbc0ef26731d607b9a39ef7ec3e13cbc"}
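
The cadvisor warning above ("Failed to process watch event ... Status 404") records a benign race during container start: the cgroup directory for crio-896814538cf8... appears before CRI-O can answer queries about that container, so the first lookup fails until the runtime catches up. A stub sketch of the lookup-with-retry pattern commonly used for such races (illustrative stdlib Go, not cadvisor's manager; the stub succeeds on the second attempt by construction):

package main

import (
	"errors"
	"fmt"
	"time"
)

// errStatus404 stands in for the "Status 404 returned error" seen above.
var errStatus404 = errors.New("can't find the container with id")

// lookupContainer is a stub for the runtime query that raced the cgroup
// watch event; it succeeds once the runtime has registered the container.
func lookupContainer(id string, attempt int) error {
	if attempt == 0 {
		return errStatus404 // cgroup exists, CRI-O not ready to report it yet
	}
	return nil
}

func main() {
	id := "896814538cf8adff64302e6c7bc7d87dcbc0ef26731d607b9a39ef7ec3e13cbc"
	for attempt := 0; ; attempt++ {
		err := lookupContainer(id, attempt)
		if err == nil {
			fmt.Println("container visible on attempt", attempt+1)
			return
		}
		if !errors.Is(err, errStatus404) || attempt >= 2 {
			fmt.Println("giving up:", err)
			return
		}
		time.Sleep(100 * time.Millisecond) // back off and retry
	}
}
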
13:39:54 crc kubenswrapper[4763]: I0930 13:39:54.351541 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9j8qz" event={"ID":"6bab0204-e2f4-4666-a525-ce8b8cea5f17","Type":"ContainerStarted","Data":"a633433713ee5823cb6f7470872be5102558d0f63f0d55f6e220e9a2fd9d0dee"} Sep 30 13:39:54 crc kubenswrapper[4763]: I0930 13:39:54.355814 4763 generic.go:334] "Generic (PLEG): container finished" podID="c69337fe-42df-4d48-8254-9408d35e644c" containerID="0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80" exitCode=0 Sep 30 13:39:54 crc kubenswrapper[4763]: I0930 13:39:54.355854 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4fzf" event={"ID":"c69337fe-42df-4d48-8254-9408d35e644c","Type":"ContainerDied","Data":"0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80"} Sep 30 13:39:54 crc kubenswrapper[4763]: I0930 13:39:54.373264 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9j8qz" podStartSLOduration=2.891607226 podStartE2EDuration="5.37324436s" podCreationTimestamp="2025-09-30 13:39:49 +0000 UTC" firstStartedPulling="2025-09-30 13:39:51.319950331 +0000 UTC m=+263.458510616" lastFinishedPulling="2025-09-30 13:39:53.801587465 +0000 UTC m=+265.940147750" observedRunningTime="2025-09-30 13:39:54.37249579 +0000 UTC m=+266.511056085" watchObservedRunningTime="2025-09-30 13:39:54.37324436 +0000 UTC m=+266.511804655" Sep 30 13:39:55 crc kubenswrapper[4763]: I0930 13:39:55.368446 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6brb" event={"ID":"adf74762-2792-4fe2-8ce5-e5e7c7f88469","Type":"ContainerStarted","Data":"cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1"} Sep 30 13:39:55 crc kubenswrapper[4763]: I0930 13:39:55.371619 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4fzf" event={"ID":"c69337fe-42df-4d48-8254-9408d35e644c","Type":"ContainerStarted","Data":"fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409"} Sep 30 13:39:55 crc kubenswrapper[4763]: I0930 13:39:55.414808 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h4fzf" podStartSLOduration=1.7656082020000001 podStartE2EDuration="4.414790401s" podCreationTimestamp="2025-09-30 13:39:51 +0000 UTC" firstStartedPulling="2025-09-30 13:39:52.326955225 +0000 UTC m=+264.465515510" lastFinishedPulling="2025-09-30 13:39:54.976137424 +0000 UTC m=+267.114697709" observedRunningTime="2025-09-30 13:39:55.414543634 +0000 UTC m=+267.553103929" watchObservedRunningTime="2025-09-30 13:39:55.414790401 +0000 UTC m=+267.553350686" Sep 30 13:39:56 crc kubenswrapper[4763]: I0930 13:39:56.380355 4763 generic.go:334] "Generic (PLEG): container finished" podID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" containerID="cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1" exitCode=0 Sep 30 13:39:56 crc kubenswrapper[4763]: I0930 13:39:56.380682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6brb" event={"ID":"adf74762-2792-4fe2-8ce5-e5e7c7f88469","Type":"ContainerDied","Data":"cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1"} Sep 30 13:39:56 crc kubenswrapper[4763]: I0930 13:39:56.380712 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6brb" 
event={"ID":"adf74762-2792-4fe2-8ce5-e5e7c7f88469","Type":"ContainerStarted","Data":"0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4"} Sep 30 13:39:56 crc kubenswrapper[4763]: I0930 13:39:56.397897 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b6brb" podStartSLOduration=1.903283144 podStartE2EDuration="4.397880811s" podCreationTimestamp="2025-09-30 13:39:52 +0000 UTC" firstStartedPulling="2025-09-30 13:39:53.345200538 +0000 UTC m=+265.483760823" lastFinishedPulling="2025-09-30 13:39:55.839798205 +0000 UTC m=+267.978358490" observedRunningTime="2025-09-30 13:39:56.395495317 +0000 UTC m=+268.534055602" watchObservedRunningTime="2025-09-30 13:39:56.397880811 +0000 UTC m=+268.536441096" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.453442 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.453505 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.453563 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.454047 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.455117 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.456289 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.456442 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.464821 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:57 crc kubenswrapper[4763]: 
I0930 13:39:57.465084 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.469306 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.478328 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.478959 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.614345 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.648012 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:57 crc kubenswrapper[4763]: I0930 13:39:57.662355 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:57 crc kubenswrapper[4763]: W0930 13:39:57.885715 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-4e403a3af822629d03ff0a500ee7e05dca2df0c4eb9cf339cb1eee80a52bd6bc WatchSource:0}: Error finding container 4e403a3af822629d03ff0a500ee7e05dca2df0c4eb9cf339cb1eee80a52bd6bc: Status 404 returned error can't find the container with id 4e403a3af822629d03ff0a500ee7e05dca2df0c4eb9cf339cb1eee80a52bd6bc Sep 30 13:39:58 crc kubenswrapper[4763]: I0930 13:39:58.390685 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0db18063efce5ccdb3b51e293eb6790e3cdc632f743e107cb0fe3719ec06caab"} Sep 30 13:39:58 crc kubenswrapper[4763]: I0930 13:39:58.391260 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4e403a3af822629d03ff0a500ee7e05dca2df0c4eb9cf339cb1eee80a52bd6bc"} Sep 30 13:39:58 crc kubenswrapper[4763]: I0930 13:39:58.392254 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"53e03b3805118e12194380cf0ecc9d5ad9d683ddcb1d65de1fb3a124183729d4"} Sep 30 13:39:58 crc kubenswrapper[4763]: I0930 13:39:58.392286 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3a2931c082e12e3e29d8798ec983697d53b75a020540683410841c6acb95e066"} Sep 30 13:39:58 crc kubenswrapper[4763]: I0930 13:39:58.395007 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"52c17ad8d02e0e6505c6396d1633c3eb43ba9adb270258388e887fb55d35477f"} Sep 30 13:39:58 crc kubenswrapper[4763]: I0930 13:39:58.395061 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f688847913c7b9100e39d56b77de7d65c842ba9d38fa54c5c25a8f74bef3a1aa"} Sep 30 13:39:58 crc kubenswrapper[4763]: I0930 13:39:58.395220 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:59 crc kubenswrapper[4763]: I0930 13:39:59.547845 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:59 crc kubenswrapper[4763]: I0930 13:39:59.548197 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:39:59 crc kubenswrapper[4763]: I0930 13:39:59.592199 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:40:00 crc kubenswrapper[4763]: I0930 13:40:00.148481 4763 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:40:00 crc kubenswrapper[4763]: I0930 13:40:00.148913 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:40:00 crc kubenswrapper[4763]: I0930 13:40:00.210899 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:40:00 crc kubenswrapper[4763]: I0930 13:40:00.454130 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9j8qz" Sep 30 13:40:00 crc kubenswrapper[4763]: I0930 13:40:00.477483 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5nfg4" Sep 30 13:40:01 crc kubenswrapper[4763]: I0930 13:40:01.960887 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:40:01 crc kubenswrapper[4763]: I0930 13:40:01.962617 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:40:02 crc kubenswrapper[4763]: I0930 13:40:02.043546 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:40:02 crc kubenswrapper[4763]: I0930 13:40:02.524261 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 13:40:02 crc kubenswrapper[4763]: I0930 13:40:02.545458 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b6brb" Sep 30 13:40:02 crc kubenswrapper[4763]: I0930 13:40:02.545509 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b6brb" Sep 30 13:40:02 crc kubenswrapper[4763]: I0930 13:40:02.593623 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b6brb" Sep 30 13:40:03 crc kubenswrapper[4763]: I0930 13:40:03.527407 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b6brb" Sep 30 13:40:37 crc kubenswrapper[4763]: I0930 13:40:37.667870 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:06 crc kubenswrapper[4763]: I0930 13:41:06.059724 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:41:06 crc kubenswrapper[4763]: I0930 13:41:06.060398 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:41:36 crc kubenswrapper[4763]: I0930 13:41:36.060155 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:41:36 crc kubenswrapper[4763]: I0930 13:41:36.060859 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:42:06 crc kubenswrapper[4763]: I0930 13:42:06.060045 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:42:06 crc kubenswrapper[4763]: I0930 13:42:06.060529 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:42:06 crc kubenswrapper[4763]: I0930 13:42:06.060578 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:42:06 crc kubenswrapper[4763]: I0930 13:42:06.061184 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e86f4169f74235b6e40ac7fe666fe2e530464ddaaf4bcda5a2f4e63d77e25c9"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:42:06 crc kubenswrapper[4763]: I0930 13:42:06.061248 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://1e86f4169f74235b6e40ac7fe666fe2e530464ddaaf4bcda5a2f4e63d77e25c9" gracePeriod=600 Sep 30 13:42:06 crc kubenswrapper[4763]: I0930 13:42:06.277908 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="1e86f4169f74235b6e40ac7fe666fe2e530464ddaaf4bcda5a2f4e63d77e25c9" exitCode=0 Sep 30 13:42:06 crc kubenswrapper[4763]: I0930 13:42:06.278109 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"1e86f4169f74235b6e40ac7fe666fe2e530464ddaaf4bcda5a2f4e63d77e25c9"} Sep 30 13:42:06 crc kubenswrapper[4763]: I0930 13:42:06.278276 4763 scope.go:117] "RemoveContainer" containerID="31801da25196b577850e7d0fb77c1e568e2512d921e2cb6159aca9a4b7e72eaa" Sep 30 13:42:07 crc kubenswrapper[4763]: I0930 13:42:07.285738 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"6a835f68fa095d0605d9b01f19066aa12d7ae1a68f6f7ff31a2cdf8fb87d2cb8"} Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.422055 4763 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-rbzvf"] Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.423454 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.525327 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rbzvf"] Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.536052 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42549d2a-1961-4097-bbea-1ad2eeb41719-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.536110 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42549d2a-1961-4097-bbea-1ad2eeb41719-registry-tls\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.536145 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwmgt\" (UniqueName: \"kubernetes.io/projected/42549d2a-1961-4097-bbea-1ad2eeb41719-kube-api-access-gwmgt\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.536162 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42549d2a-1961-4097-bbea-1ad2eeb41719-bound-sa-token\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.536186 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42549d2a-1961-4097-bbea-1ad2eeb41719-registry-certificates\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.536207 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42549d2a-1961-4097-bbea-1ad2eeb41719-trusted-ca\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.536232 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 
13:42:37.536255 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42549d2a-1961-4097-bbea-1ad2eeb41719-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.565371 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.636992 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwmgt\" (UniqueName: \"kubernetes.io/projected/42549d2a-1961-4097-bbea-1ad2eeb41719-kube-api-access-gwmgt\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.637040 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42549d2a-1961-4097-bbea-1ad2eeb41719-bound-sa-token\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.637075 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42549d2a-1961-4097-bbea-1ad2eeb41719-registry-certificates\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.637099 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42549d2a-1961-4097-bbea-1ad2eeb41719-trusted-ca\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.637130 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42549d2a-1961-4097-bbea-1ad2eeb41719-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.637168 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42549d2a-1961-4097-bbea-1ad2eeb41719-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.637210 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/42549d2a-1961-4097-bbea-1ad2eeb41719-registry-tls\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.637762 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42549d2a-1961-4097-bbea-1ad2eeb41719-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.638349 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42549d2a-1961-4097-bbea-1ad2eeb41719-trusted-ca\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.638349 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42549d2a-1961-4097-bbea-1ad2eeb41719-registry-certificates\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.645259 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42549d2a-1961-4097-bbea-1ad2eeb41719-registry-tls\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.652040 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42549d2a-1961-4097-bbea-1ad2eeb41719-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.656496 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42549d2a-1961-4097-bbea-1ad2eeb41719-bound-sa-token\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.659949 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwmgt\" (UniqueName: \"kubernetes.io/projected/42549d2a-1961-4097-bbea-1ad2eeb41719-kube-api-access-gwmgt\") pod \"image-registry-66df7c8f76-rbzvf\" (UID: \"42549d2a-1961-4097-bbea-1ad2eeb41719\") " pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:37 crc kubenswrapper[4763]: I0930 13:42:37.737717 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:38 crc kubenswrapper[4763]: I0930 13:42:38.138714 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rbzvf"] Sep 30 13:42:38 crc kubenswrapper[4763]: I0930 13:42:38.528067 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" event={"ID":"42549d2a-1961-4097-bbea-1ad2eeb41719","Type":"ContainerStarted","Data":"da9800f1bd393d84a4b35c49ae93afae401530c6d6b412d41f233ee2ccb71fb3"} Sep 30 13:42:38 crc kubenswrapper[4763]: I0930 13:42:38.528115 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" event={"ID":"42549d2a-1961-4097-bbea-1ad2eeb41719","Type":"ContainerStarted","Data":"2d2d15c2e76c21cd864df73907db314d288e3e4d858b5c420750ac94fe7e5e43"} Sep 30 13:42:38 crc kubenswrapper[4763]: I0930 13:42:38.528290 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:38 crc kubenswrapper[4763]: I0930 13:42:38.552681 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" podStartSLOduration=1.5526632500000002 podStartE2EDuration="1.55266325s" podCreationTimestamp="2025-09-30 13:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:42:38.54819307 +0000 UTC m=+430.686753355" watchObservedRunningTime="2025-09-30 13:42:38.55266325 +0000 UTC m=+430.691223555" Sep 30 13:42:57 crc kubenswrapper[4763]: I0930 13:42:57.748766 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-rbzvf" Sep 30 13:42:57 crc kubenswrapper[4763]: I0930 13:42:57.805003 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jmpjx"] Sep 30 13:43:22 crc kubenswrapper[4763]: I0930 13:43:22.852240 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" podUID="5b970ab9-2ae4-48ea-a4a2-db0e890a156a" containerName="registry" containerID="cri-o://db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64" gracePeriod=30 Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.238648 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.401690 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-bound-sa-token\") pod \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.401734 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-ca-trust-extracted\") pod \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.401756 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-certificates\") pod \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.401800 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-installation-pull-secrets\") pod \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.401831 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-trusted-ca\") pod \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.401862 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spbgf\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-kube-api-access-spbgf\") pod \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.401903 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-tls\") pod \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.402055 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\" (UID: \"5b970ab9-2ae4-48ea-a4a2-db0e890a156a\") " Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.403165 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5b970ab9-2ae4-48ea-a4a2-db0e890a156a" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.403450 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5b970ab9-2ae4-48ea-a4a2-db0e890a156a" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.410199 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5b970ab9-2ae4-48ea-a4a2-db0e890a156a" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.413712 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-kube-api-access-spbgf" (OuterVolumeSpecName: "kube-api-access-spbgf") pod "5b970ab9-2ae4-48ea-a4a2-db0e890a156a" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a"). InnerVolumeSpecName "kube-api-access-spbgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.414083 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5b970ab9-2ae4-48ea-a4a2-db0e890a156a" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.414251 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5b970ab9-2ae4-48ea-a4a2-db0e890a156a" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.421273 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5b970ab9-2ae4-48ea-a4a2-db0e890a156a" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.440494 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5b970ab9-2ae4-48ea-a4a2-db0e890a156a" (UID: "5b970ab9-2ae4-48ea-a4a2-db0e890a156a"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.504440 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.504518 4763 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.504548 4763 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.504580 4763 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.504651 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.504679 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spbgf\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-kube-api-access-spbgf\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.504706 4763 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b970ab9-2ae4-48ea-a4a2-db0e890a156a-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.789404 4763 generic.go:334] "Generic (PLEG): container finished" podID="5b970ab9-2ae4-48ea-a4a2-db0e890a156a" containerID="db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64" exitCode=0 Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.789451 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" event={"ID":"5b970ab9-2ae4-48ea-a4a2-db0e890a156a","Type":"ContainerDied","Data":"db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64"} Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.789476 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" event={"ID":"5b970ab9-2ae4-48ea-a4a2-db0e890a156a","Type":"ContainerDied","Data":"0201642ab442770a1bbcdec4f0c1acc3aab597810d91687239492fb9cefa0344"} Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.789491 4763 scope.go:117] "RemoveContainer" containerID="db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.789490 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jmpjx" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.813863 4763 scope.go:117] "RemoveContainer" containerID="db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64" Sep 30 13:43:23 crc kubenswrapper[4763]: E0930 13:43:23.814676 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64\": container with ID starting with db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64 not found: ID does not exist" containerID="db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.814794 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64"} err="failed to get container status \"db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64\": rpc error: code = NotFound desc = could not find container \"db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64\": container with ID starting with db887efa0075b3184d1f18dce6c383cc59d8545e37eac4104ec780383e191e64 not found: ID does not exist" Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.834918 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jmpjx"] Sep 30 13:43:23 crc kubenswrapper[4763]: I0930 13:43:23.838123 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jmpjx"] Sep 30 13:43:24 crc kubenswrapper[4763]: I0930 13:43:24.497734 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b970ab9-2ae4-48ea-a4a2-db0e890a156a" path="/var/lib/kubelet/pods/5b970ab9-2ae4-48ea-a4a2-db0e890a156a/volumes" Sep 30 13:44:06 crc kubenswrapper[4763]: I0930 13:44:06.059515 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:44:06 crc kubenswrapper[4763]: I0930 13:44:06.060814 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:44:36 crc kubenswrapper[4763]: I0930 13:44:36.060007 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:44:36 crc kubenswrapper[4763]: I0930 13:44:36.060761 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.139930 4763 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75"] Sep 30 13:45:00 crc kubenswrapper[4763]: E0930 13:45:00.140740 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b970ab9-2ae4-48ea-a4a2-db0e890a156a" containerName="registry" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.140756 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b970ab9-2ae4-48ea-a4a2-db0e890a156a" containerName="registry" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.140868 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b970ab9-2ae4-48ea-a4a2-db0e890a156a" containerName="registry" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.141305 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.142085 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr849\" (UniqueName: \"kubernetes.io/projected/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-kube-api-access-fr849\") pod \"collect-profiles-29320665-cwm75\" (UID: \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.142129 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-config-volume\") pod \"collect-profiles-29320665-cwm75\" (UID: \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.142153 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-secret-volume\") pod \"collect-profiles-29320665-cwm75\" (UID: \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.143420 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.144215 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.152520 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75"] Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.243377 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr849\" (UniqueName: \"kubernetes.io/projected/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-kube-api-access-fr849\") pod \"collect-profiles-29320665-cwm75\" (UID: \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.243474 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-config-volume\") pod \"collect-profiles-29320665-cwm75\" (UID: 
\"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.243694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-secret-volume\") pod \"collect-profiles-29320665-cwm75\" (UID: \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.245394 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-config-volume\") pod \"collect-profiles-29320665-cwm75\" (UID: \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.250634 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-secret-volume\") pod \"collect-profiles-29320665-cwm75\" (UID: \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.263132 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr849\" (UniqueName: \"kubernetes.io/projected/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-kube-api-access-fr849\") pod \"collect-profiles-29320665-cwm75\" (UID: \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.463257 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:00 crc kubenswrapper[4763]: I0930 13:45:00.883388 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75"] Sep 30 13:45:00 crc kubenswrapper[4763]: W0930 13:45:00.892713 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ca1ac89_3e97_43f0_a8a1_4b9dd101887d.slice/crio-d85f6c18113864bb123a05cc12878910599ed26d821595517780d082cc2010b8 WatchSource:0}: Error finding container d85f6c18113864bb123a05cc12878910599ed26d821595517780d082cc2010b8: Status 404 returned error can't find the container with id d85f6c18113864bb123a05cc12878910599ed26d821595517780d082cc2010b8 Sep 30 13:45:01 crc kubenswrapper[4763]: I0930 13:45:01.415783 4763 generic.go:334] "Generic (PLEG): container finished" podID="6ca1ac89-3e97-43f0-a8a1-4b9dd101887d" containerID="1b53fb21c0f15bb62c3da5c0ff83299537f6c2a687b868630fb8ca957e6df5ab" exitCode=0 Sep 30 13:45:01 crc kubenswrapper[4763]: I0930 13:45:01.415942 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" event={"ID":"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d","Type":"ContainerDied","Data":"1b53fb21c0f15bb62c3da5c0ff83299537f6c2a687b868630fb8ca957e6df5ab"} Sep 30 13:45:01 crc kubenswrapper[4763]: I0930 13:45:01.416488 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" event={"ID":"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d","Type":"ContainerStarted","Data":"d85f6c18113864bb123a05cc12878910599ed26d821595517780d082cc2010b8"} Sep 30 13:45:02 crc kubenswrapper[4763]: I0930 13:45:02.637731 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:02 crc kubenswrapper[4763]: I0930 13:45:02.671973 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr849\" (UniqueName: \"kubernetes.io/projected/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-kube-api-access-fr849\") pod \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\" (UID: \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " Sep 30 13:45:02 crc kubenswrapper[4763]: I0930 13:45:02.672037 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-config-volume\") pod \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\" (UID: \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " Sep 30 13:45:02 crc kubenswrapper[4763]: I0930 13:45:02.672076 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-secret-volume\") pod \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\" (UID: \"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d\") " Sep 30 13:45:02 crc kubenswrapper[4763]: I0930 13:45:02.672908 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-config-volume" (OuterVolumeSpecName: "config-volume") pod "6ca1ac89-3e97-43f0-a8a1-4b9dd101887d" (UID: "6ca1ac89-3e97-43f0-a8a1-4b9dd101887d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:45:02 crc kubenswrapper[4763]: I0930 13:45:02.678069 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-kube-api-access-fr849" (OuterVolumeSpecName: "kube-api-access-fr849") pod "6ca1ac89-3e97-43f0-a8a1-4b9dd101887d" (UID: "6ca1ac89-3e97-43f0-a8a1-4b9dd101887d"). InnerVolumeSpecName "kube-api-access-fr849". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:45:02 crc kubenswrapper[4763]: I0930 13:45:02.678099 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6ca1ac89-3e97-43f0-a8a1-4b9dd101887d" (UID: "6ca1ac89-3e97-43f0-a8a1-4b9dd101887d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:45:02 crc kubenswrapper[4763]: I0930 13:45:02.772867 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr849\" (UniqueName: \"kubernetes.io/projected/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-kube-api-access-fr849\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:02 crc kubenswrapper[4763]: I0930 13:45:02.772895 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:02 crc kubenswrapper[4763]: I0930 13:45:02.772904 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:03 crc kubenswrapper[4763]: I0930 13:45:03.433447 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" event={"ID":"6ca1ac89-3e97-43f0-a8a1-4b9dd101887d","Type":"ContainerDied","Data":"d85f6c18113864bb123a05cc12878910599ed26d821595517780d082cc2010b8"} Sep 30 13:45:03 crc kubenswrapper[4763]: I0930 13:45:03.433522 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75" Sep 30 13:45:03 crc kubenswrapper[4763]: I0930 13:45:03.433575 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d85f6c18113864bb123a05cc12878910599ed26d821595517780d082cc2010b8" Sep 30 13:45:06 crc kubenswrapper[4763]: I0930 13:45:06.060000 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:45:06 crc kubenswrapper[4763]: I0930 13:45:06.060311 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:45:06 crc kubenswrapper[4763]: I0930 13:45:06.060380 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:45:06 crc kubenswrapper[4763]: I0930 13:45:06.061118 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a835f68fa095d0605d9b01f19066aa12d7ae1a68f6f7ff31a2cdf8fb87d2cb8"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:45:06 crc kubenswrapper[4763]: I0930 13:45:06.061202 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://6a835f68fa095d0605d9b01f19066aa12d7ae1a68f6f7ff31a2cdf8fb87d2cb8" gracePeriod=600 Sep 30 13:45:06 crc kubenswrapper[4763]: I0930 13:45:06.457768 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="6a835f68fa095d0605d9b01f19066aa12d7ae1a68f6f7ff31a2cdf8fb87d2cb8" exitCode=0 Sep 30 13:45:06 crc kubenswrapper[4763]: I0930 13:45:06.457862 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"6a835f68fa095d0605d9b01f19066aa12d7ae1a68f6f7ff31a2cdf8fb87d2cb8"} Sep 30 13:45:06 crc kubenswrapper[4763]: I0930 13:45:06.458529 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"f66002987a3e708ee53022f61f57bc4019ea893682ccce020d3b5027a63c2bf8"} Sep 30 13:45:06 crc kubenswrapper[4763]: I0930 13:45:06.458588 4763 scope.go:117] "RemoveContainer" containerID="1e86f4169f74235b6e40ac7fe666fe2e530464ddaaf4bcda5a2f4e63d77e25c9" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.188628 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5rtn6"] Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.192063 4763 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovn-controller" containerID="cri-o://c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f" gracePeriod=30 Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.192382 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda" gracePeriod=30 Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.192469 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="kube-rbac-proxy-node" containerID="cri-o://6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371" gracePeriod=30 Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.192461 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="northd" containerID="cri-o://a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b" gracePeriod=30 Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.192628 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="nbdb" containerID="cri-o://ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2" gracePeriod=30 Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.192691 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovn-acl-logging" containerID="cri-o://5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2" gracePeriod=30 Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.193404 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="sbdb" containerID="cri-o://3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f" gracePeriod=30 Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.225342 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" containerID="cri-o://394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91" gracePeriod=30 Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.459357 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/3.log" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.462056 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovn-acl-logging/0.log" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.462551 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovn-controller/0.log" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.463182 
4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.514636 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ft8fb"] Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515086 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515112 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515140 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515149 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515160 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="kube-rbac-proxy-node" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515169 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="kube-rbac-proxy-node" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515181 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovn-acl-logging" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515189 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovn-acl-logging" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515201 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="sbdb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515227 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="sbdb" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515238 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="northd" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515245 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="northd" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515256 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515263 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515271 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="kubecfg-setup" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515282 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="kubecfg-setup" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515310 4763 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovn-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515318 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovn-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515328 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="nbdb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515335 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="nbdb" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515342 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515349 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515359 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca1ac89-3e97-43f0-a8a1-4b9dd101887d" containerName="collect-profiles" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515367 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca1ac89-3e97-43f0-a8a1-4b9dd101887d" containerName="collect-profiles" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515393 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515400 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515530 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515557 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515565 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="kube-rbac-proxy-node" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515575 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="nbdb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515584 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515593 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ca1ac89-3e97-43f0-a8a1-4b9dd101887d" containerName="collect-profiles" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515625 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovn-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515634 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="sbdb" Sep 30 13:46:49 crc 
kubenswrapper[4763]: I0930 13:46:49.515646 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515655 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="northd" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515663 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovn-acl-logging" Sep 30 13:46:49 crc kubenswrapper[4763]: E0930 13:46:49.515806 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515816 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.515951 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.516203 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerName="ovnkube-controller" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.518594 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629012 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-netns\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629103 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-etc-openvswitch\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629163 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-slash\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629189 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-log-socket\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629216 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-bin\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629237 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629211 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629265 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-config\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629292 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-slash" (OuterVolumeSpecName: "host-slash") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629302 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-log-socket" (OuterVolumeSpecName: "log-socket") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629362 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-var-lib-cni-networks-ovn-kubernetes\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629398 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6prbc\" (UniqueName: \"kubernetes.io/projected/da518be6-b52d-4130-aab2-f27bfd4f9571-kube-api-access-6prbc\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629389 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629456 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629471 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-netd\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-systemd\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629489 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629570 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-var-lib-openvswitch\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629590 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-kubelet\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629628 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da518be6-b52d-4130-aab2-f27bfd4f9571-ovn-node-metrics-cert\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629636 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629652 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-openvswitch\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629674 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629713 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629695 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-ovn\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629738 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629800 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-ovn-kubernetes\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629867 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-env-overrides\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629895 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629914 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-systemd-units\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629928 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629967 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-script-lib\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.629981 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630006 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-node-log\") pod \"da518be6-b52d-4130-aab2-f27bfd4f9571\" (UID: \"da518be6-b52d-4130-aab2-f27bfd4f9571\") " Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630155 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-node-log" (OuterVolumeSpecName: "node-log") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630244 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630271 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-node-log\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630363 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-ovnkube-config\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630426 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-env-overrides\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630481 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-var-lib-openvswitch\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630536 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pttfv\" (UniqueName: \"kubernetes.io/projected/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-kube-api-access-pttfv\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630572 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-run-systemd\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630638 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630826 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-run-ovn\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630949 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-run-netns\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.630998 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.631037 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-run-ovn-kubernetes\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.631151 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-ovnkube-script-lib\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.631357 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-slash\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.631407 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-cni-netd\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.631435 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-cni-bin\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.631469 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-log-socket\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.631500 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-ovn-node-metrics-cert\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.632378 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-run-openvswitch\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.632683 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-systemd-units\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.632742 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-kubelet\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.632791 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-etc-openvswitch\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.632930 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.632947 4763 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.632958 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.632969 4763 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.632977 4763 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-netns\") on node \"crc\" DevicePath 
\"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.632987 4763 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.632995 4763 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-log-socket\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.633003 4763 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-slash\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.633011 4763 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.633023 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da518be6-b52d-4130-aab2-f27bfd4f9571-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.633033 4763 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.633048 4763 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.633058 4763 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.633066 4763 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.633074 4763 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.633083 4763 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.633093 4763 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.635983 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da518be6-b52d-4130-aab2-f27bfd4f9571-kube-api-access-6prbc" (OuterVolumeSpecName: "kube-api-access-6prbc") pod 
"da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "kube-api-access-6prbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.636270 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da518be6-b52d-4130-aab2-f27bfd4f9571-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.646323 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "da518be6-b52d-4130-aab2-f27bfd4f9571" (UID: "da518be6-b52d-4130-aab2-f27bfd4f9571"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.733812 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-run-ovn\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.733881 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-run-netns\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.733910 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.733935 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-run-ovn-kubernetes\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.733936 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-run-ovn\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.733972 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-ovnkube-script-lib\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.733993 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734000 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-slash\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734021 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-cni-netd\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734044 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-cni-bin\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734069 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-log-socket\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734090 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-ovn-node-metrics-cert\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734085 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-run-netns\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734144 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-cni-netd\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734116 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-run-openvswitch\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734130 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-slash\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734156 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-run-openvswitch\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-systemd-units\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734076 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-run-ovn-kubernetes\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-cni-bin\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734256 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-kubelet\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734186 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-log-socket\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734305 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-etc-openvswitch\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734317 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-systemd-units\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734376 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-node-log\") pod \"ovnkube-node-ft8fb\" (UID: 
\"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734372 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-etc-openvswitch\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734408 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-ovnkube-config\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734423 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-host-kubelet\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734455 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-node-log\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734580 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-env-overrides\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734645 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-ovnkube-script-lib\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734658 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-var-lib-openvswitch\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734699 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pttfv\" (UniqueName: \"kubernetes.io/projected/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-kube-api-access-pttfv\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734723 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-run-systemd\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc 
kubenswrapper[4763]: I0930 13:46:49.734769 4763 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da518be6-b52d-4130-aab2-f27bfd4f9571-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734783 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da518be6-b52d-4130-aab2-f27bfd4f9571-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734796 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6prbc\" (UniqueName: \"kubernetes.io/projected/da518be6-b52d-4130-aab2-f27bfd4f9571-kube-api-access-6prbc\") on node \"crc\" DevicePath \"\"" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734804 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-var-lib-openvswitch\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734826 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-run-systemd\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.734970 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-env-overrides\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.735168 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-ovnkube-config\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.738374 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-ovn-node-metrics-cert\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.750796 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pttfv\" (UniqueName: \"kubernetes.io/projected/1682b055-2e5d-4fea-b9ae-d12a24b9ff50-kube-api-access-pttfv\") pod \"ovnkube-node-ft8fb\" (UID: \"1682b055-2e5d-4fea-b9ae-d12a24b9ff50\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:49 crc kubenswrapper[4763]: I0930 13:46:49.835142 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.092428 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9qpw_766e1024-d943-4721-a366-83bc3635cc79/kube-multus/2.log" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.093816 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9qpw_766e1024-d943-4721-a366-83bc3635cc79/kube-multus/1.log" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.093952 4763 generic.go:334] "Generic (PLEG): container finished" podID="766e1024-d943-4721-a366-83bc3635cc79" containerID="5b43269fae80af1d4f4436c691aca5e5984ef49e50d5581da67884a6052cbef2" exitCode=2 Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.094012 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9qpw" event={"ID":"766e1024-d943-4721-a366-83bc3635cc79","Type":"ContainerDied","Data":"5b43269fae80af1d4f4436c691aca5e5984ef49e50d5581da67884a6052cbef2"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.094183 4763 scope.go:117] "RemoveContainer" containerID="2dc6bde7a5880048f5d3ea37b60d99dbdaf19713202bb1e9a214c546227dd37e" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.095209 4763 scope.go:117] "RemoveContainer" containerID="5b43269fae80af1d4f4436c691aca5e5984ef49e50d5581da67884a6052cbef2" Sep 30 13:46:50 crc kubenswrapper[4763]: E0930 13:46:50.095684 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-c9qpw_openshift-multus(766e1024-d943-4721-a366-83bc3635cc79)\"" pod="openshift-multus/multus-c9qpw" podUID="766e1024-d943-4721-a366-83bc3635cc79" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.096779 4763 generic.go:334] "Generic (PLEG): container finished" podID="1682b055-2e5d-4fea-b9ae-d12a24b9ff50" containerID="0263efedb67cee160ba706b65913f5df5da888ebef8d8884d2ddfab1019d29c5" exitCode=0 Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.096935 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" event={"ID":"1682b055-2e5d-4fea-b9ae-d12a24b9ff50","Type":"ContainerDied","Data":"0263efedb67cee160ba706b65913f5df5da888ebef8d8884d2ddfab1019d29c5"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.097051 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" event={"ID":"1682b055-2e5d-4fea-b9ae-d12a24b9ff50","Type":"ContainerStarted","Data":"b306ac4c7e23684c58bc5cc11cfc1539935164be6f7c2a89583fcf960ea44b47"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.104261 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovnkube-controller/3.log" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.107786 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovn-acl-logging/0.log" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.108787 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rtn6_da518be6-b52d-4130-aab2-f27bfd4f9571/ovn-controller/0.log" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109468 4763 generic.go:334] "Generic (PLEG): container finished" 
podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91" exitCode=0 Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109503 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f" exitCode=0 Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109518 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2" exitCode=0 Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109532 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b" exitCode=0 Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109541 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda" exitCode=0 Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109551 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371" exitCode=0 Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109533 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109563 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2" exitCode=143 Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109653 4763 generic.go:334] "Generic (PLEG): container finished" podID="da518be6-b52d-4130-aab2-f27bfd4f9571" containerID="c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f" exitCode=143 Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109661 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109711 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109740 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda"} Sep 30 13:46:50 crc 
kubenswrapper[4763]: I0930 13:46:50.109800 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109830 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109859 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109878 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109896 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109914 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109934 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109952 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109968 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.109985 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110001 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110022 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110047 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110066 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110083 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110099 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110117 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110135 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110151 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110167 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110183 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110198 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110219 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110244 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110263 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110280 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110297 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110314 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110330 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110349 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110365 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110381 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110399 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110426 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" event={"ID":"da518be6-b52d-4130-aab2-f27bfd4f9571","Type":"ContainerDied","Data":"5333e1ff3aab79ac4c2fdc5dc93f3594e01c428fc9d4d8b30708ede2bf254cc7"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110451 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110469 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110486 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110502 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110518 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110533 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110550 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110566 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110583 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.110634 4763 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11"} Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.111466 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5rtn6" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.148168 4763 scope.go:117] "RemoveContainer" containerID="394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.173545 4763 scope.go:117] "RemoveContainer" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.202671 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5rtn6"] Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.207150 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5rtn6"] Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.216628 4763 scope.go:117] "RemoveContainer" containerID="3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.235668 4763 scope.go:117] "RemoveContainer" containerID="ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.250059 4763 scope.go:117] "RemoveContainer" containerID="a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.278980 4763 scope.go:117] "RemoveContainer" containerID="b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.298465 4763 scope.go:117] "RemoveContainer" containerID="6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.337826 4763 scope.go:117] "RemoveContainer" containerID="5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.354969 4763 scope.go:117] "RemoveContainer" containerID="c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.373201 4763 scope.go:117] "RemoveContainer" containerID="e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.386951 4763 scope.go:117] "RemoveContainer" containerID="394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91" Sep 30 13:46:50 crc kubenswrapper[4763]: E0930 13:46:50.387770 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91\": container with ID starting with 394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91 not found: ID does not exist" 
containerID="394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.387812 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91"} err="failed to get container status \"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91\": rpc error: code = NotFound desc = could not find container \"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91\": container with ID starting with 394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.387842 4763 scope.go:117] "RemoveContainer" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" Sep 30 13:46:50 crc kubenswrapper[4763]: E0930 13:46:50.388292 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\": container with ID starting with 8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158 not found: ID does not exist" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.388385 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158"} err="failed to get container status \"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\": rpc error: code = NotFound desc = could not find container \"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\": container with ID starting with 8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.388476 4763 scope.go:117] "RemoveContainer" containerID="3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f" Sep 30 13:46:50 crc kubenswrapper[4763]: E0930 13:46:50.388942 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\": container with ID starting with 3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f not found: ID does not exist" containerID="3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.388966 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f"} err="failed to get container status \"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\": rpc error: code = NotFound desc = could not find container \"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\": container with ID starting with 3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.388980 4763 scope.go:117] "RemoveContainer" containerID="ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2" Sep 30 13:46:50 crc kubenswrapper[4763]: E0930 13:46:50.389348 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\": container with ID starting with ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2 not found: ID does not exist" containerID="ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.389369 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2"} err="failed to get container status \"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\": rpc error: code = NotFound desc = could not find container \"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\": container with ID starting with ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.389381 4763 scope.go:117] "RemoveContainer" containerID="a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b" Sep 30 13:46:50 crc kubenswrapper[4763]: E0930 13:46:50.389689 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\": container with ID starting with a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b not found: ID does not exist" containerID="a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.389713 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b"} err="failed to get container status \"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\": rpc error: code = NotFound desc = could not find container \"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\": container with ID starting with a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.389727 4763 scope.go:117] "RemoveContainer" containerID="b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda" Sep 30 13:46:50 crc kubenswrapper[4763]: E0930 13:46:50.389983 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\": container with ID starting with b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda not found: ID does not exist" containerID="b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.390058 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda"} err="failed to get container status \"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\": rpc error: code = NotFound desc = could not find container \"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\": container with ID starting with b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.390123 4763 scope.go:117] "RemoveContainer" containerID="6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371" Sep 30 13:46:50 crc 
kubenswrapper[4763]: E0930 13:46:50.390503 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\": container with ID starting with 6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371 not found: ID does not exist" containerID="6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.390525 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371"} err="failed to get container status \"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\": rpc error: code = NotFound desc = could not find container \"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\": container with ID starting with 6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.390541 4763 scope.go:117] "RemoveContainer" containerID="5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2" Sep 30 13:46:50 crc kubenswrapper[4763]: E0930 13:46:50.390866 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\": container with ID starting with 5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2 not found: ID does not exist" containerID="5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.390937 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2"} err="failed to get container status \"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\": rpc error: code = NotFound desc = could not find container \"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\": container with ID starting with 5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.391002 4763 scope.go:117] "RemoveContainer" containerID="c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f" Sep 30 13:46:50 crc kubenswrapper[4763]: E0930 13:46:50.391331 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\": container with ID starting with c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f not found: ID does not exist" containerID="c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.391378 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f"} err="failed to get container status \"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\": rpc error: code = NotFound desc = could not find container \"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\": container with ID starting with c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: 
I0930 13:46:50.391451 4763 scope.go:117] "RemoveContainer" containerID="e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11" Sep 30 13:46:50 crc kubenswrapper[4763]: E0930 13:46:50.391883 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\": container with ID starting with e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11 not found: ID does not exist" containerID="e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.391910 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11"} err="failed to get container status \"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\": rpc error: code = NotFound desc = could not find container \"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\": container with ID starting with e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.391930 4763 scope.go:117] "RemoveContainer" containerID="394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.392233 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91"} err="failed to get container status \"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91\": rpc error: code = NotFound desc = could not find container \"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91\": container with ID starting with 394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.392255 4763 scope.go:117] "RemoveContainer" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.392665 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158"} err="failed to get container status \"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\": rpc error: code = NotFound desc = could not find container \"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\": container with ID starting with 8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.392761 4763 scope.go:117] "RemoveContainer" containerID="3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.393166 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f"} err="failed to get container status \"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\": rpc error: code = NotFound desc = could not find container \"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\": container with ID starting with 3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: 
I0930 13:46:50.393255 4763 scope.go:117] "RemoveContainer" containerID="ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.393580 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2"} err="failed to get container status \"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\": rpc error: code = NotFound desc = could not find container \"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\": container with ID starting with ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.393641 4763 scope.go:117] "RemoveContainer" containerID="a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.394094 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b"} err="failed to get container status \"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\": rpc error: code = NotFound desc = could not find container \"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\": container with ID starting with a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.394138 4763 scope.go:117] "RemoveContainer" containerID="b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.394423 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda"} err="failed to get container status \"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\": rpc error: code = NotFound desc = could not find container \"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\": container with ID starting with b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.394443 4763 scope.go:117] "RemoveContainer" containerID="6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.394701 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371"} err="failed to get container status \"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\": rpc error: code = NotFound desc = could not find container \"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\": container with ID starting with 6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.394803 4763 scope.go:117] "RemoveContainer" containerID="5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.395251 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2"} err="failed to get container status 
\"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\": rpc error: code = NotFound desc = could not find container \"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\": container with ID starting with 5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.395298 4763 scope.go:117] "RemoveContainer" containerID="c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.395542 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f"} err="failed to get container status \"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\": rpc error: code = NotFound desc = could not find container \"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\": container with ID starting with c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.395619 4763 scope.go:117] "RemoveContainer" containerID="e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.395905 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11"} err="failed to get container status \"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\": rpc error: code = NotFound desc = could not find container \"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\": container with ID starting with e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.396002 4763 scope.go:117] "RemoveContainer" containerID="394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.396404 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91"} err="failed to get container status \"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91\": rpc error: code = NotFound desc = could not find container \"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91\": container with ID starting with 394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.396428 4763 scope.go:117] "RemoveContainer" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.396727 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158"} err="failed to get container status \"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\": rpc error: code = NotFound desc = could not find container \"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\": container with ID starting with 8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.396747 4763 scope.go:117] "RemoveContainer" 
containerID="3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.397003 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f"} err="failed to get container status \"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\": rpc error: code = NotFound desc = could not find container \"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\": container with ID starting with 3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.397095 4763 scope.go:117] "RemoveContainer" containerID="ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.397460 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2"} err="failed to get container status \"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\": rpc error: code = NotFound desc = could not find container \"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\": container with ID starting with ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.397503 4763 scope.go:117] "RemoveContainer" containerID="a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.397825 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b"} err="failed to get container status \"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\": rpc error: code = NotFound desc = could not find container \"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\": container with ID starting with a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.397845 4763 scope.go:117] "RemoveContainer" containerID="b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.398114 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda"} err="failed to get container status \"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\": rpc error: code = NotFound desc = could not find container \"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\": container with ID starting with b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.398310 4763 scope.go:117] "RemoveContainer" containerID="6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.398748 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371"} err="failed to get container status \"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\": rpc error: code = NotFound desc = could not find 
container \"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\": container with ID starting with 6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.398763 4763 scope.go:117] "RemoveContainer" containerID="5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.399074 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2"} err="failed to get container status \"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\": rpc error: code = NotFound desc = could not find container \"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\": container with ID starting with 5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.399092 4763 scope.go:117] "RemoveContainer" containerID="c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.399638 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f"} err="failed to get container status \"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\": rpc error: code = NotFound desc = could not find container \"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\": container with ID starting with c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.399665 4763 scope.go:117] "RemoveContainer" containerID="e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.399988 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11"} err="failed to get container status \"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\": rpc error: code = NotFound desc = could not find container \"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\": container with ID starting with e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.400232 4763 scope.go:117] "RemoveContainer" containerID="394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.400773 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91"} err="failed to get container status \"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91\": rpc error: code = NotFound desc = could not find container \"394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91\": container with ID starting with 394fc3c012454f46b5688389617e2a2f6892f9c138fbf53df982baee9ae3be91 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.400792 4763 scope.go:117] "RemoveContainer" containerID="8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.401222 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158"} err="failed to get container status \"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\": rpc error: code = NotFound desc = could not find container \"8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158\": container with ID starting with 8ad8abb3a18ac67d26ed1668c12b46523970fcbb741bf3c9c5d599bf67891158 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.401247 4763 scope.go:117] "RemoveContainer" containerID="3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.401588 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f"} err="failed to get container status \"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\": rpc error: code = NotFound desc = could not find container \"3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f\": container with ID starting with 3fe70d99edf3ced87f1890cf7654cf6804d9e37c07caeda70653a0454d6a014f not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.401640 4763 scope.go:117] "RemoveContainer" containerID="ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.401943 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2"} err="failed to get container status \"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\": rpc error: code = NotFound desc = could not find container \"ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2\": container with ID starting with ca4a37e7883112bd7a42d1a8a9adf0cf0069bd5d8bd6b42663219c1aa95021c2 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.401970 4763 scope.go:117] "RemoveContainer" containerID="a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.402494 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b"} err="failed to get container status \"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\": rpc error: code = NotFound desc = could not find container \"a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b\": container with ID starting with a7c6fb7e88747891d281814f7316267a712004b53e332ccecb283bee1eef2b9b not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.402544 4763 scope.go:117] "RemoveContainer" containerID="b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.402931 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda"} err="failed to get container status \"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\": rpc error: code = NotFound desc = could not find container \"b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda\": container with ID starting with 
b0b5359d6bdd4b9f3b8cb633c6b6ab39df1f19c09665724d0e514b654fbe8fda not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.402969 4763 scope.go:117] "RemoveContainer" containerID="6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.403344 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371"} err="failed to get container status \"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\": rpc error: code = NotFound desc = could not find container \"6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371\": container with ID starting with 6c2471c4c604afa2514abc51817107bb6087d2a51a75135acd53819c5bf8c371 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.403388 4763 scope.go:117] "RemoveContainer" containerID="5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.403837 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2"} err="failed to get container status \"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\": rpc error: code = NotFound desc = could not find container \"5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2\": container with ID starting with 5bd1e1e18fed396bee55109593b82215aaffcd5df796ef14c07b9faa50a924a2 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.404002 4763 scope.go:117] "RemoveContainer" containerID="c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.404434 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f"} err="failed to get container status \"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\": rpc error: code = NotFound desc = could not find container \"c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f\": container with ID starting with c1a8ddb3f1c4dff1be102780a3e5a728fb588d8365ed958e0f574ef51d4ea34f not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.404463 4763 scope.go:117] "RemoveContainer" containerID="e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.404947 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11"} err="failed to get container status \"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\": rpc error: code = NotFound desc = could not find container \"e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11\": container with ID starting with e99f2517596b9f56df14e9da42503c3ddc80831fb39a5efd51c929270fcaae11 not found: ID does not exist" Sep 30 13:46:50 crc kubenswrapper[4763]: I0930 13:46:50.495839 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da518be6-b52d-4130-aab2-f27bfd4f9571" path="/var/lib/kubelet/pods/da518be6-b52d-4130-aab2-f27bfd4f9571/volumes" Sep 30 13:46:51 crc kubenswrapper[4763]: I0930 13:46:51.118307 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-c9qpw_766e1024-d943-4721-a366-83bc3635cc79/kube-multus/2.log" Sep 30 13:46:51 crc kubenswrapper[4763]: I0930 13:46:51.125500 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" event={"ID":"1682b055-2e5d-4fea-b9ae-d12a24b9ff50","Type":"ContainerStarted","Data":"34111980d6fc8cc7377de0267f057e23992410825ad09715149bde51db436ed9"} Sep 30 13:46:51 crc kubenswrapper[4763]: I0930 13:46:51.125550 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" event={"ID":"1682b055-2e5d-4fea-b9ae-d12a24b9ff50","Type":"ContainerStarted","Data":"a0b5d2bfbe53745190b536754d928edd00bc68b9d5388d41017f0daa209643ba"} Sep 30 13:46:51 crc kubenswrapper[4763]: I0930 13:46:51.125564 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" event={"ID":"1682b055-2e5d-4fea-b9ae-d12a24b9ff50","Type":"ContainerStarted","Data":"fa290a870cf9c8bafbf3f897101904893621f7910bcb6340eaaae053bbad3849"} Sep 30 13:46:51 crc kubenswrapper[4763]: I0930 13:46:51.125575 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" event={"ID":"1682b055-2e5d-4fea-b9ae-d12a24b9ff50","Type":"ContainerStarted","Data":"92450363d052799564cbb87fb8dae3000d85863d9c7fb21a3c644d64225d52f3"} Sep 30 13:46:51 crc kubenswrapper[4763]: I0930 13:46:51.125587 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" event={"ID":"1682b055-2e5d-4fea-b9ae-d12a24b9ff50","Type":"ContainerStarted","Data":"9b644ab502fa9ebcc6a2cddb6c9a13b4dc93d21845760c3c098cb70d1d28c4b0"} Sep 30 13:46:51 crc kubenswrapper[4763]: I0930 13:46:51.125622 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" event={"ID":"1682b055-2e5d-4fea-b9ae-d12a24b9ff50","Type":"ContainerStarted","Data":"e1b8ca5cba9e73f170f4da9db36a50c3d8aadd4084cff5be7ba640e6c4c051b5"} Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.671029 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-94rm8"] Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.671846 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.673734 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.674088 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.676882 4763 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-jrp26" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.676987 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.873309 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvvcf\" (UniqueName: \"kubernetes.io/projected/4fae1a56-54ac-419f-8c7a-8230786d5188-kube-api-access-lvvcf\") pod \"crc-storage-crc-94rm8\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.873396 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4fae1a56-54ac-419f-8c7a-8230786d5188-crc-storage\") pod \"crc-storage-crc-94rm8\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.873554 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4fae1a56-54ac-419f-8c7a-8230786d5188-node-mnt\") pod \"crc-storage-crc-94rm8\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.976614 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvvcf\" (UniqueName: \"kubernetes.io/projected/4fae1a56-54ac-419f-8c7a-8230786d5188-kube-api-access-lvvcf\") pod \"crc-storage-crc-94rm8\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.976699 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4fae1a56-54ac-419f-8c7a-8230786d5188-crc-storage\") pod \"crc-storage-crc-94rm8\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.976752 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4fae1a56-54ac-419f-8c7a-8230786d5188-node-mnt\") pod \"crc-storage-crc-94rm8\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.977030 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4fae1a56-54ac-419f-8c7a-8230786d5188-node-mnt\") pod \"crc-storage-crc-94rm8\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:52 crc kubenswrapper[4763]: I0930 13:46:52.978327 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4fae1a56-54ac-419f-8c7a-8230786d5188-crc-storage\") pod \"crc-storage-crc-94rm8\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:53 crc kubenswrapper[4763]: I0930 13:46:53.012450 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvvcf\" (UniqueName: \"kubernetes.io/projected/4fae1a56-54ac-419f-8c7a-8230786d5188-kube-api-access-lvvcf\") pod \"crc-storage-crc-94rm8\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:53 crc kubenswrapper[4763]: I0930 13:46:53.149961 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" event={"ID":"1682b055-2e5d-4fea-b9ae-d12a24b9ff50","Type":"ContainerStarted","Data":"726cc85b21a1e0bd5655a5089d84df8ce6659073faee75dca5486ee1a1dbe5b0"} Sep 30 13:46:53 crc kubenswrapper[4763]: I0930 13:46:53.292168 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:53 crc kubenswrapper[4763]: E0930 13:46:53.330678 4763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(71b03b671a078aba6c7da1001a51294453cc68e93eade40fe87ffa8588a66c19): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 13:46:53 crc kubenswrapper[4763]: E0930 13:46:53.330770 4763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(71b03b671a078aba6c7da1001a51294453cc68e93eade40fe87ffa8588a66c19): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:53 crc kubenswrapper[4763]: E0930 13:46:53.330811 4763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(71b03b671a078aba6c7da1001a51294453cc68e93eade40fe87ffa8588a66c19): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:53 crc kubenswrapper[4763]: E0930 13:46:53.330872 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-94rm8_crc-storage(4fae1a56-54ac-419f-8c7a-8230786d5188)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-94rm8_crc-storage(4fae1a56-54ac-419f-8c7a-8230786d5188)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(71b03b671a078aba6c7da1001a51294453cc68e93eade40fe87ffa8588a66c19): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-94rm8" podUID="4fae1a56-54ac-419f-8c7a-8230786d5188" Sep 30 13:46:56 crc kubenswrapper[4763]: I0930 13:46:56.136583 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-94rm8"] Sep 30 13:46:56 crc kubenswrapper[4763]: I0930 13:46:56.137018 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:56 crc kubenswrapper[4763]: I0930 13:46:56.137473 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:56 crc kubenswrapper[4763]: E0930 13:46:56.158646 4763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(4c0ddd4faa0f7ccc4354ac2dd5ca65bd824cd643796ff36cdb6a3c7e71c5c7f0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 13:46:56 crc kubenswrapper[4763]: E0930 13:46:56.158723 4763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(4c0ddd4faa0f7ccc4354ac2dd5ca65bd824cd643796ff36cdb6a3c7e71c5c7f0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:56 crc kubenswrapper[4763]: E0930 13:46:56.158749 4763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(4c0ddd4faa0f7ccc4354ac2dd5ca65bd824cd643796ff36cdb6a3c7e71c5c7f0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:46:56 crc kubenswrapper[4763]: E0930 13:46:56.158809 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-94rm8_crc-storage(4fae1a56-54ac-419f-8c7a-8230786d5188)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-94rm8_crc-storage(4fae1a56-54ac-419f-8c7a-8230786d5188)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(4c0ddd4faa0f7ccc4354ac2dd5ca65bd824cd643796ff36cdb6a3c7e71c5c7f0): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-94rm8" podUID="4fae1a56-54ac-419f-8c7a-8230786d5188" Sep 30 13:46:56 crc kubenswrapper[4763]: I0930 13:46:56.171222 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" event={"ID":"1682b055-2e5d-4fea-b9ae-d12a24b9ff50","Type":"ContainerStarted","Data":"242c4db1a812101ea256ecb22980ce4f35fe44a8dd895706d110dedf56416b2b"} Sep 30 13:46:56 crc kubenswrapper[4763]: I0930 13:46:56.171576 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:56 crc kubenswrapper[4763]: I0930 13:46:56.171627 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:56 crc kubenswrapper[4763]: I0930 13:46:56.197701 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:56 crc kubenswrapper[4763]: I0930 13:46:56.205677 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" podStartSLOduration=7.205654189 podStartE2EDuration="7.205654189s" podCreationTimestamp="2025-09-30 13:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:46:56.203171115 +0000 UTC m=+688.341731410" watchObservedRunningTime="2025-09-30 13:46:56.205654189 +0000 UTC m=+688.344214474" Sep 30 13:46:57 crc kubenswrapper[4763]: I0930 13:46:57.180800 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:46:57 crc kubenswrapper[4763]: I0930 13:46:57.213384 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:47:03 crc kubenswrapper[4763]: I0930 13:47:03.489175 4763 scope.go:117] "RemoveContainer" containerID="5b43269fae80af1d4f4436c691aca5e5984ef49e50d5581da67884a6052cbef2" Sep 30 13:47:03 crc kubenswrapper[4763]: E0930 13:47:03.490346 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-c9qpw_openshift-multus(766e1024-d943-4721-a366-83bc3635cc79)\"" pod="openshift-multus/multus-c9qpw" podUID="766e1024-d943-4721-a366-83bc3635cc79" Sep 30 13:47:06 crc kubenswrapper[4763]: I0930 13:47:06.060571 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:47:06 crc kubenswrapper[4763]: I0930 13:47:06.060686 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:47:08 crc kubenswrapper[4763]: I0930 13:47:08.489569 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:47:08 crc kubenswrapper[4763]: I0930 13:47:08.494884 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:47:08 crc kubenswrapper[4763]: E0930 13:47:08.539669 4763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(c8c26f39f1599f43f62bc8333beb730006b4c62368039705efa267525c032b70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 13:47:08 crc kubenswrapper[4763]: E0930 13:47:08.539753 4763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(c8c26f39f1599f43f62bc8333beb730006b4c62368039705efa267525c032b70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:47:08 crc kubenswrapper[4763]: E0930 13:47:08.539803 4763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(c8c26f39f1599f43f62bc8333beb730006b4c62368039705efa267525c032b70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:47:08 crc kubenswrapper[4763]: E0930 13:47:08.539887 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-94rm8_crc-storage(4fae1a56-54ac-419f-8c7a-8230786d5188)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-94rm8_crc-storage(4fae1a56-54ac-419f-8c7a-8230786d5188)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-94rm8_crc-storage_4fae1a56-54ac-419f-8c7a-8230786d5188_0(c8c26f39f1599f43f62bc8333beb730006b4c62368039705efa267525c032b70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-94rm8" podUID="4fae1a56-54ac-419f-8c7a-8230786d5188" Sep 30 13:47:18 crc kubenswrapper[4763]: I0930 13:47:18.495559 4763 scope.go:117] "RemoveContainer" containerID="5b43269fae80af1d4f4436c691aca5e5984ef49e50d5581da67884a6052cbef2" Sep 30 13:47:19 crc kubenswrapper[4763]: I0930 13:47:19.323575 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c9qpw_766e1024-d943-4721-a366-83bc3635cc79/kube-multus/2.log" Sep 30 13:47:19 crc kubenswrapper[4763]: I0930 13:47:19.324044 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c9qpw" event={"ID":"766e1024-d943-4721-a366-83bc3635cc79","Type":"ContainerStarted","Data":"6ad8741f4c28e0fe5fa181b03a5671f4c4762498a1d964290caab455bb9ed1fd"} Sep 30 13:47:19 crc kubenswrapper[4763]: I0930 13:47:19.860564 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ft8fb" Sep 30 13:47:23 crc kubenswrapper[4763]: I0930 13:47:23.489033 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:47:23 crc kubenswrapper[4763]: I0930 13:47:23.490156 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:47:23 crc kubenswrapper[4763]: I0930 13:47:23.707512 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-94rm8"] Sep 30 13:47:23 crc kubenswrapper[4763]: I0930 13:47:23.715884 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:47:24 crc kubenswrapper[4763]: I0930 13:47:24.353494 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-94rm8" event={"ID":"4fae1a56-54ac-419f-8c7a-8230786d5188","Type":"ContainerStarted","Data":"bf945402f602e13949930ebd61b85133b369832b540457f01ae6377344dd5351"} Sep 30 13:47:26 crc kubenswrapper[4763]: I0930 13:47:26.368048 4763 generic.go:334] "Generic (PLEG): container finished" podID="4fae1a56-54ac-419f-8c7a-8230786d5188" containerID="90d2b0cf3fcdf2dc227081ea734b8b59d4df1002ff947c97d32d3c7ea42c767f" exitCode=0 Sep 30 13:47:26 crc kubenswrapper[4763]: I0930 13:47:26.368122 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-94rm8" event={"ID":"4fae1a56-54ac-419f-8c7a-8230786d5188","Type":"ContainerDied","Data":"90d2b0cf3fcdf2dc227081ea734b8b59d4df1002ff947c97d32d3c7ea42c767f"} Sep 30 13:47:27 crc kubenswrapper[4763]: I0930 13:47:27.588734 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:47:27 crc kubenswrapper[4763]: I0930 13:47:27.758199 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4fae1a56-54ac-419f-8c7a-8230786d5188-crc-storage\") pod \"4fae1a56-54ac-419f-8c7a-8230786d5188\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " Sep 30 13:47:27 crc kubenswrapper[4763]: I0930 13:47:27.758303 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4fae1a56-54ac-419f-8c7a-8230786d5188-node-mnt\") pod \"4fae1a56-54ac-419f-8c7a-8230786d5188\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " Sep 30 13:47:27 crc kubenswrapper[4763]: I0930 13:47:27.758352 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvvcf\" (UniqueName: \"kubernetes.io/projected/4fae1a56-54ac-419f-8c7a-8230786d5188-kube-api-access-lvvcf\") pod \"4fae1a56-54ac-419f-8c7a-8230786d5188\" (UID: \"4fae1a56-54ac-419f-8c7a-8230786d5188\") " Sep 30 13:47:27 crc kubenswrapper[4763]: I0930 13:47:27.758446 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae1a56-54ac-419f-8c7a-8230786d5188-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "4fae1a56-54ac-419f-8c7a-8230786d5188" (UID: "4fae1a56-54ac-419f-8c7a-8230786d5188"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:47:27 crc kubenswrapper[4763]: I0930 13:47:27.758593 4763 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4fae1a56-54ac-419f-8c7a-8230786d5188-node-mnt\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:27 crc kubenswrapper[4763]: I0930 13:47:27.764683 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fae1a56-54ac-419f-8c7a-8230786d5188-kube-api-access-lvvcf" (OuterVolumeSpecName: "kube-api-access-lvvcf") pod "4fae1a56-54ac-419f-8c7a-8230786d5188" (UID: "4fae1a56-54ac-419f-8c7a-8230786d5188"). InnerVolumeSpecName "kube-api-access-lvvcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:47:27 crc kubenswrapper[4763]: I0930 13:47:27.776378 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fae1a56-54ac-419f-8c7a-8230786d5188-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "4fae1a56-54ac-419f-8c7a-8230786d5188" (UID: "4fae1a56-54ac-419f-8c7a-8230786d5188"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:47:27 crc kubenswrapper[4763]: I0930 13:47:27.859592 4763 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4fae1a56-54ac-419f-8c7a-8230786d5188-crc-storage\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:27 crc kubenswrapper[4763]: I0930 13:47:27.859666 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvvcf\" (UniqueName: \"kubernetes.io/projected/4fae1a56-54ac-419f-8c7a-8230786d5188-kube-api-access-lvvcf\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:28 crc kubenswrapper[4763]: I0930 13:47:28.383790 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-94rm8" event={"ID":"4fae1a56-54ac-419f-8c7a-8230786d5188","Type":"ContainerDied","Data":"bf945402f602e13949930ebd61b85133b369832b540457f01ae6377344dd5351"} Sep 30 13:47:28 crc kubenswrapper[4763]: I0930 13:47:28.383855 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf945402f602e13949930ebd61b85133b369832b540457f01ae6377344dd5351" Sep 30 13:47:28 crc kubenswrapper[4763]: I0930 13:47:28.383876 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-94rm8" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.264808 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87"] Sep 30 13:47:34 crc kubenswrapper[4763]: E0930 13:47:34.265483 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae1a56-54ac-419f-8c7a-8230786d5188" containerName="storage" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.265498 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae1a56-54ac-419f-8c7a-8230786d5188" containerName="storage" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.265627 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae1a56-54ac-419f-8c7a-8230786d5188" containerName="storage" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.266357 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.268533 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.273335 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87"] Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.446287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.446331 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.446365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm4sl\" (UniqueName: \"kubernetes.io/projected/bf82c0dd-1274-44d1-ac55-d1e2278de472-kube-api-access-nm4sl\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.547546 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.547636 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.547686 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm4sl\" (UniqueName: \"kubernetes.io/projected/bf82c0dd-1274-44d1-ac55-d1e2278de472-kube-api-access-nm4sl\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.548280 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.548494 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.567498 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm4sl\" (UniqueName: \"kubernetes.io/projected/bf82c0dd-1274-44d1-ac55-d1e2278de472-kube-api-access-nm4sl\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.581307 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:34 crc kubenswrapper[4763]: I0930 13:47:34.774167 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87"] Sep 30 13:47:34 crc kubenswrapper[4763]: W0930 13:47:34.777968 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf82c0dd_1274_44d1_ac55_d1e2278de472.slice/crio-f04dde64d5b24447ab85b6e30ae59422ff39bbd2d44626caa4084794bfe014f0 WatchSource:0}: Error finding container f04dde64d5b24447ab85b6e30ae59422ff39bbd2d44626caa4084794bfe014f0: Status 404 returned error can't find the container with id f04dde64d5b24447ab85b6e30ae59422ff39bbd2d44626caa4084794bfe014f0 Sep 30 13:47:35 crc kubenswrapper[4763]: I0930 13:47:35.429550 4763 generic.go:334] "Generic (PLEG): container finished" podID="bf82c0dd-1274-44d1-ac55-d1e2278de472" containerID="349936ca8b4693194ee333fa53257a001f8c4ee50d6a85038d49805318285278" exitCode=0 Sep 30 13:47:35 crc kubenswrapper[4763]: I0930 13:47:35.429673 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" event={"ID":"bf82c0dd-1274-44d1-ac55-d1e2278de472","Type":"ContainerDied","Data":"349936ca8b4693194ee333fa53257a001f8c4ee50d6a85038d49805318285278"} Sep 30 13:47:35 crc kubenswrapper[4763]: I0930 13:47:35.429882 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" event={"ID":"bf82c0dd-1274-44d1-ac55-d1e2278de472","Type":"ContainerStarted","Data":"f04dde64d5b24447ab85b6e30ae59422ff39bbd2d44626caa4084794bfe014f0"} Sep 30 13:47:36 crc kubenswrapper[4763]: I0930 13:47:36.059554 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:47:36 crc 
kubenswrapper[4763]: I0930 13:47:36.059654 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:47:38 crc kubenswrapper[4763]: I0930 13:47:38.446907 4763 generic.go:334] "Generic (PLEG): container finished" podID="bf82c0dd-1274-44d1-ac55-d1e2278de472" containerID="2c4980cd0fc0b4560b5937e064001246b099c67e79539d0c5f7e0e91a48f8bd5" exitCode=0 Sep 30 13:47:38 crc kubenswrapper[4763]: I0930 13:47:38.446957 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" event={"ID":"bf82c0dd-1274-44d1-ac55-d1e2278de472","Type":"ContainerDied","Data":"2c4980cd0fc0b4560b5937e064001246b099c67e79539d0c5f7e0e91a48f8bd5"} Sep 30 13:47:39 crc kubenswrapper[4763]: I0930 13:47:39.457516 4763 generic.go:334] "Generic (PLEG): container finished" podID="bf82c0dd-1274-44d1-ac55-d1e2278de472" containerID="816e4db240d86772a0589e597c1a8e3f23163c14845828e87e77ea80a60ce11e" exitCode=0 Sep 30 13:47:39 crc kubenswrapper[4763]: I0930 13:47:39.457869 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" event={"ID":"bf82c0dd-1274-44d1-ac55-d1e2278de472","Type":"ContainerDied","Data":"816e4db240d86772a0589e597c1a8e3f23163c14845828e87e77ea80a60ce11e"} Sep 30 13:47:40 crc kubenswrapper[4763]: I0930 13:47:40.683005 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:40 crc kubenswrapper[4763]: I0930 13:47:40.821839 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-bundle\") pod \"bf82c0dd-1274-44d1-ac55-d1e2278de472\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " Sep 30 13:47:40 crc kubenswrapper[4763]: I0930 13:47:40.821937 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm4sl\" (UniqueName: \"kubernetes.io/projected/bf82c0dd-1274-44d1-ac55-d1e2278de472-kube-api-access-nm4sl\") pod \"bf82c0dd-1274-44d1-ac55-d1e2278de472\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " Sep 30 13:47:40 crc kubenswrapper[4763]: I0930 13:47:40.822024 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-util\") pod \"bf82c0dd-1274-44d1-ac55-d1e2278de472\" (UID: \"bf82c0dd-1274-44d1-ac55-d1e2278de472\") " Sep 30 13:47:40 crc kubenswrapper[4763]: I0930 13:47:40.822693 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-bundle" (OuterVolumeSpecName: "bundle") pod "bf82c0dd-1274-44d1-ac55-d1e2278de472" (UID: "bf82c0dd-1274-44d1-ac55-d1e2278de472"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:47:40 crc kubenswrapper[4763]: I0930 13:47:40.830560 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf82c0dd-1274-44d1-ac55-d1e2278de472-kube-api-access-nm4sl" (OuterVolumeSpecName: "kube-api-access-nm4sl") pod "bf82c0dd-1274-44d1-ac55-d1e2278de472" (UID: "bf82c0dd-1274-44d1-ac55-d1e2278de472"). InnerVolumeSpecName "kube-api-access-nm4sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:47:40 crc kubenswrapper[4763]: I0930 13:47:40.835010 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-util" (OuterVolumeSpecName: "util") pod "bf82c0dd-1274-44d1-ac55-d1e2278de472" (UID: "bf82c0dd-1274-44d1-ac55-d1e2278de472"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:47:40 crc kubenswrapper[4763]: I0930 13:47:40.923633 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:40 crc kubenswrapper[4763]: I0930 13:47:40.923684 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm4sl\" (UniqueName: \"kubernetes.io/projected/bf82c0dd-1274-44d1-ac55-d1e2278de472-kube-api-access-nm4sl\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:40 crc kubenswrapper[4763]: I0930 13:47:40.923694 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf82c0dd-1274-44d1-ac55-d1e2278de472-util\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:41 crc kubenswrapper[4763]: I0930 13:47:41.470836 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" event={"ID":"bf82c0dd-1274-44d1-ac55-d1e2278de472","Type":"ContainerDied","Data":"f04dde64d5b24447ab85b6e30ae59422ff39bbd2d44626caa4084794bfe014f0"} Sep 30 13:47:41 crc kubenswrapper[4763]: I0930 13:47:41.470891 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f04dde64d5b24447ab85b6e30ae59422ff39bbd2d44626caa4084794bfe014f0" Sep 30 13:47:41 crc kubenswrapper[4763]: I0930 13:47:41.470938 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87" Sep 30 13:47:42 crc kubenswrapper[4763]: I0930 13:47:42.937924 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-gdkg2"] Sep 30 13:47:42 crc kubenswrapper[4763]: E0930 13:47:42.938153 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf82c0dd-1274-44d1-ac55-d1e2278de472" containerName="pull" Sep 30 13:47:42 crc kubenswrapper[4763]: I0930 13:47:42.938169 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf82c0dd-1274-44d1-ac55-d1e2278de472" containerName="pull" Sep 30 13:47:42 crc kubenswrapper[4763]: E0930 13:47:42.938181 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf82c0dd-1274-44d1-ac55-d1e2278de472" containerName="extract" Sep 30 13:47:42 crc kubenswrapper[4763]: I0930 13:47:42.938188 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf82c0dd-1274-44d1-ac55-d1e2278de472" containerName="extract" Sep 30 13:47:42 crc kubenswrapper[4763]: E0930 13:47:42.938207 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf82c0dd-1274-44d1-ac55-d1e2278de472" containerName="util" Sep 30 13:47:42 crc kubenswrapper[4763]: I0930 13:47:42.938214 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf82c0dd-1274-44d1-ac55-d1e2278de472" containerName="util" Sep 30 13:47:42 crc kubenswrapper[4763]: I0930 13:47:42.938337 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf82c0dd-1274-44d1-ac55-d1e2278de472" containerName="extract" Sep 30 13:47:42 crc kubenswrapper[4763]: I0930 13:47:42.938770 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gdkg2" Sep 30 13:47:42 crc kubenswrapper[4763]: I0930 13:47:42.941085 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-n2kjz" Sep 30 13:47:42 crc kubenswrapper[4763]: I0930 13:47:42.941892 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 13:47:42 crc kubenswrapper[4763]: I0930 13:47:42.943394 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 13:47:42 crc kubenswrapper[4763]: I0930 13:47:42.985107 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-gdkg2"] Sep 30 13:47:43 crc kubenswrapper[4763]: I0930 13:47:43.047790 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j2bh\" (UniqueName: \"kubernetes.io/projected/099f1320-84d1-45bd-a71b-36248dadb714-kube-api-access-6j2bh\") pod \"nmstate-operator-5d6f6cfd66-gdkg2\" (UID: \"099f1320-84d1-45bd-a71b-36248dadb714\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gdkg2" Sep 30 13:47:43 crc kubenswrapper[4763]: I0930 13:47:43.149476 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j2bh\" (UniqueName: \"kubernetes.io/projected/099f1320-84d1-45bd-a71b-36248dadb714-kube-api-access-6j2bh\") pod \"nmstate-operator-5d6f6cfd66-gdkg2\" (UID: \"099f1320-84d1-45bd-a71b-36248dadb714\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gdkg2" Sep 30 13:47:43 crc kubenswrapper[4763]: I0930 13:47:43.167643 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j2bh\" 
(UniqueName: \"kubernetes.io/projected/099f1320-84d1-45bd-a71b-36248dadb714-kube-api-access-6j2bh\") pod \"nmstate-operator-5d6f6cfd66-gdkg2\" (UID: \"099f1320-84d1-45bd-a71b-36248dadb714\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gdkg2" Sep 30 13:47:43 crc kubenswrapper[4763]: I0930 13:47:43.254588 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gdkg2" Sep 30 13:47:43 crc kubenswrapper[4763]: W0930 13:47:43.440632 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod099f1320_84d1_45bd_a71b_36248dadb714.slice/crio-159cfd8bf093b42321f9de0e2540bbef32fdcdb8bc35ac01f236f0510c643ae9 WatchSource:0}: Error finding container 159cfd8bf093b42321f9de0e2540bbef32fdcdb8bc35ac01f236f0510c643ae9: Status 404 returned error can't find the container with id 159cfd8bf093b42321f9de0e2540bbef32fdcdb8bc35ac01f236f0510c643ae9 Sep 30 13:47:43 crc kubenswrapper[4763]: I0930 13:47:43.440918 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-gdkg2"] Sep 30 13:47:43 crc kubenswrapper[4763]: I0930 13:47:43.484779 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gdkg2" event={"ID":"099f1320-84d1-45bd-a71b-36248dadb714","Type":"ContainerStarted","Data":"159cfd8bf093b42321f9de0e2540bbef32fdcdb8bc35ac01f236f0510c643ae9"} Sep 30 13:47:54 crc kubenswrapper[4763]: I0930 13:47:54.562087 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gdkg2" event={"ID":"099f1320-84d1-45bd-a71b-36248dadb714","Type":"ContainerStarted","Data":"2bb19ea072ffda76f48bc6df739038577b965d46b905b75eada00d92aa4e81d6"} Sep 30 13:47:54 crc kubenswrapper[4763]: I0930 13:47:54.586264 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gdkg2" podStartSLOduration=2.191891996 podStartE2EDuration="12.586247174s" podCreationTimestamp="2025-09-30 13:47:42 +0000 UTC" firstStartedPulling="2025-09-30 13:47:43.443834337 +0000 UTC m=+735.582394622" lastFinishedPulling="2025-09-30 13:47:53.838189515 +0000 UTC m=+745.976749800" observedRunningTime="2025-09-30 13:47:54.585380382 +0000 UTC m=+746.723940707" watchObservedRunningTime="2025-09-30 13:47:54.586247174 +0000 UTC m=+746.724807469" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.480487 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk"] Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.481557 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.484063 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xcn5z" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.494305 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk"] Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.498856 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n"] Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.499640 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.501490 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.518083 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-d8g2c"] Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.519090 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.532085 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n"] Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.609890 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/535ccdf4-0560-4eb1-bfc6-8135453e4e11-dbus-socket\") pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.609939 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpsws\" (UniqueName: \"kubernetes.io/projected/535ccdf4-0560-4eb1-bfc6-8135453e4e11-kube-api-access-tpsws\") pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.609971 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj26c\" (UniqueName: \"kubernetes.io/projected/668f7b93-4e0d-4344-b856-1507f347c5a1-kube-api-access-jj26c\") pod \"nmstate-webhook-6d689559c5-dqq6n\" (UID: \"668f7b93-4e0d-4344-b856-1507f347c5a1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.610128 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcpqt\" (UniqueName: \"kubernetes.io/projected/8f52d94c-384b-4cbe-ac9d-aeffdb2769bb-kube-api-access-qcpqt\") pod \"nmstate-metrics-58fcddf996-7zvfk\" (UID: \"8f52d94c-384b-4cbe-ac9d-aeffdb2769bb\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.610228 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/535ccdf4-0560-4eb1-bfc6-8135453e4e11-ovs-socket\") pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.610345 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/668f7b93-4e0d-4344-b856-1507f347c5a1-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-dqq6n\" (UID: \"668f7b93-4e0d-4344-b856-1507f347c5a1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.610382 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/535ccdf4-0560-4eb1-bfc6-8135453e4e11-nmstate-lock\") 
pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.633133 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7"] Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.633928 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.636857 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-26jrr" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.636920 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.636853 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.644799 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7"] Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.711904 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcpqt\" (UniqueName: \"kubernetes.io/projected/8f52d94c-384b-4cbe-ac9d-aeffdb2769bb-kube-api-access-qcpqt\") pod \"nmstate-metrics-58fcddf996-7zvfk\" (UID: \"8f52d94c-384b-4cbe-ac9d-aeffdb2769bb\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.711949 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/535ccdf4-0560-4eb1-bfc6-8135453e4e11-ovs-socket\") pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.711991 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/668f7b93-4e0d-4344-b856-1507f347c5a1-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-dqq6n\" (UID: \"668f7b93-4e0d-4344-b856-1507f347c5a1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.712011 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/535ccdf4-0560-4eb1-bfc6-8135453e4e11-nmstate-lock\") pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.712043 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/535ccdf4-0560-4eb1-bfc6-8135453e4e11-dbus-socket\") pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.712060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpsws\" (UniqueName: \"kubernetes.io/projected/535ccdf4-0560-4eb1-bfc6-8135453e4e11-kube-api-access-tpsws\") pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 
30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.712078 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj26c\" (UniqueName: \"kubernetes.io/projected/668f7b93-4e0d-4344-b856-1507f347c5a1-kube-api-access-jj26c\") pod \"nmstate-webhook-6d689559c5-dqq6n\" (UID: \"668f7b93-4e0d-4344-b856-1507f347c5a1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.712276 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/535ccdf4-0560-4eb1-bfc6-8135453e4e11-nmstate-lock\") pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.712491 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/535ccdf4-0560-4eb1-bfc6-8135453e4e11-ovs-socket\") pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.712522 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/535ccdf4-0560-4eb1-bfc6-8135453e4e11-dbus-socket\") pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.717117 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/668f7b93-4e0d-4344-b856-1507f347c5a1-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-dqq6n\" (UID: \"668f7b93-4e0d-4344-b856-1507f347c5a1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.731263 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcpqt\" (UniqueName: \"kubernetes.io/projected/8f52d94c-384b-4cbe-ac9d-aeffdb2769bb-kube-api-access-qcpqt\") pod \"nmstate-metrics-58fcddf996-7zvfk\" (UID: \"8f52d94c-384b-4cbe-ac9d-aeffdb2769bb\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.735316 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj26c\" (UniqueName: \"kubernetes.io/projected/668f7b93-4e0d-4344-b856-1507f347c5a1-kube-api-access-jj26c\") pod \"nmstate-webhook-6d689559c5-dqq6n\" (UID: \"668f7b93-4e0d-4344-b856-1507f347c5a1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.744239 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpsws\" (UniqueName: \"kubernetes.io/projected/535ccdf4-0560-4eb1-bfc6-8135453e4e11-kube-api-access-tpsws\") pod \"nmstate-handler-d8g2c\" (UID: \"535ccdf4-0560-4eb1-bfc6-8135453e4e11\") " pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.802238 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.815829 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-85wg7\" (UID: \"04c0360a-87b1-434f-8d7b-9aadd2e5ab33\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.815892 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-85wg7\" (UID: \"04c0360a-87b1-434f-8d7b-9aadd2e5ab33\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.815926 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72rkf\" (UniqueName: \"kubernetes.io/projected/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-kube-api-access-72rkf\") pod \"nmstate-console-plugin-864bb6dfb5-85wg7\" (UID: \"04c0360a-87b1-434f-8d7b-9aadd2e5ab33\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.820464 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.837233 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.845133 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cc8bd7b4-2ldgk"] Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.846018 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.858938 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc8bd7b4-2ldgk"] Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.917224 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72rkf\" (UniqueName: \"kubernetes.io/projected/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-kube-api-access-72rkf\") pod \"nmstate-console-plugin-864bb6dfb5-85wg7\" (UID: \"04c0360a-87b1-434f-8d7b-9aadd2e5ab33\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.917354 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-85wg7\" (UID: \"04c0360a-87b1-434f-8d7b-9aadd2e5ab33\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.917399 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-85wg7\" (UID: \"04c0360a-87b1-434f-8d7b-9aadd2e5ab33\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:55 crc kubenswrapper[4763]: E0930 13:47:55.917564 4763 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Sep 30 13:47:55 crc kubenswrapper[4763]: E0930 13:47:55.917640 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-plugin-serving-cert podName:04c0360a-87b1-434f-8d7b-9aadd2e5ab33 nodeName:}" failed. No retries permitted until 2025-09-30 13:47:56.417621069 +0000 UTC m=+748.556181354 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-85wg7" (UID: "04c0360a-87b1-434f-8d7b-9aadd2e5ab33") : secret "plugin-serving-cert" not found Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.918559 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-85wg7\" (UID: \"04c0360a-87b1-434f-8d7b-9aadd2e5ab33\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:55 crc kubenswrapper[4763]: I0930 13:47:55.943349 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72rkf\" (UniqueName: \"kubernetes.io/projected/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-kube-api-access-72rkf\") pod \"nmstate-console-plugin-864bb6dfb5-85wg7\" (UID: \"04c0360a-87b1-434f-8d7b-9aadd2e5ab33\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.018041 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwdgp\" (UniqueName: \"kubernetes.io/projected/814e0314-10fe-4d71-ac5d-ca0eee482a00-kube-api-access-zwdgp\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.018372 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-console-config\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.018429 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-oauth-serving-cert\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.018447 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-service-ca\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.018469 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/814e0314-10fe-4d71-ac5d-ca0eee482a00-console-oauth-config\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.018493 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/814e0314-10fe-4d71-ac5d-ca0eee482a00-console-serving-cert\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " 
pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.018522 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-trusted-ca-bundle\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.119466 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwdgp\" (UniqueName: \"kubernetes.io/projected/814e0314-10fe-4d71-ac5d-ca0eee482a00-kube-api-access-zwdgp\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.119517 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-console-config\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.119566 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-oauth-serving-cert\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.119583 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-service-ca\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.119632 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/814e0314-10fe-4d71-ac5d-ca0eee482a00-console-oauth-config\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.119669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/814e0314-10fe-4d71-ac5d-ca0eee482a00-console-serving-cert\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.119695 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-trusted-ca-bundle\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.120835 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-service-ca\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " 
pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.120860 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-trusted-ca-bundle\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.121949 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-console-config\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.122207 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/814e0314-10fe-4d71-ac5d-ca0eee482a00-oauth-serving-cert\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.123488 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/814e0314-10fe-4d71-ac5d-ca0eee482a00-console-oauth-config\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.123765 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/814e0314-10fe-4d71-ac5d-ca0eee482a00-console-serving-cert\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.134589 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwdgp\" (UniqueName: \"kubernetes.io/projected/814e0314-10fe-4d71-ac5d-ca0eee482a00-kube-api-access-zwdgp\") pod \"console-7cc8bd7b4-2ldgk\" (UID: \"814e0314-10fe-4d71-ac5d-ca0eee482a00\") " pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.192930 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.357398 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk"] Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.361052 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc8bd7b4-2ldgk"] Sep 30 13:47:56 crc kubenswrapper[4763]: W0930 13:47:56.362686 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod814e0314_10fe_4d71_ac5d_ca0eee482a00.slice/crio-e3043cd780d38d588635bbb7ee6083c5a5ee9b72a9abf9b05c69afb160c19e28 WatchSource:0}: Error finding container e3043cd780d38d588635bbb7ee6083c5a5ee9b72a9abf9b05c69afb160c19e28: Status 404 returned error can't find the container with id e3043cd780d38d588635bbb7ee6083c5a5ee9b72a9abf9b05c69afb160c19e28 Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.363536 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n"] Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.423430 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-85wg7\" (UID: \"04c0360a-87b1-434f-8d7b-9aadd2e5ab33\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.427938 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/04c0360a-87b1-434f-8d7b-9aadd2e5ab33-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-85wg7\" (UID: \"04c0360a-87b1-434f-8d7b-9aadd2e5ab33\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.548848 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.574694 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc8bd7b4-2ldgk" event={"ID":"814e0314-10fe-4d71-ac5d-ca0eee482a00","Type":"ContainerStarted","Data":"632bdf90c09aa9e960a4ed49bb9fff0418ed9f2e21a1a627593184251d6ff16c"} Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.574768 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc8bd7b4-2ldgk" event={"ID":"814e0314-10fe-4d71-ac5d-ca0eee482a00","Type":"ContainerStarted","Data":"e3043cd780d38d588635bbb7ee6083c5a5ee9b72a9abf9b05c69afb160c19e28"} Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.576636 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk" event={"ID":"8f52d94c-384b-4cbe-ac9d-aeffdb2769bb","Type":"ContainerStarted","Data":"67639e15f242c580b9249e226f63def8fc9c43ecd08adb3c6373f2b518ad14ee"} Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.577368 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-d8g2c" event={"ID":"535ccdf4-0560-4eb1-bfc6-8135453e4e11","Type":"ContainerStarted","Data":"9dd0e8addc5fb7a18a96da5e847dadb99e810292544540f6c624865bb2a6b1a8"} Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.578710 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" event={"ID":"668f7b93-4e0d-4344-b856-1507f347c5a1","Type":"ContainerStarted","Data":"87362038f5e8766ed7029d2505c8581e6cabcc6ab717702d18f14dd999abdfab"} Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.738480 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cc8bd7b4-2ldgk" podStartSLOduration=1.738463711 podStartE2EDuration="1.738463711s" podCreationTimestamp="2025-09-30 13:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:47:56.591039609 +0000 UTC m=+748.729599904" watchObservedRunningTime="2025-09-30 13:47:56.738463711 +0000 UTC m=+748.877023996" Sep 30 13:47:56 crc kubenswrapper[4763]: I0930 13:47:56.741198 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7"] Sep 30 13:47:56 crc kubenswrapper[4763]: W0930 13:47:56.752106 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04c0360a_87b1_434f_8d7b_9aadd2e5ab33.slice/crio-69d66ab257a5571ce3ccf46837f2efb01b65fa5fc43d9ac4976908f807240c17 WatchSource:0}: Error finding container 69d66ab257a5571ce3ccf46837f2efb01b65fa5fc43d9ac4976908f807240c17: Status 404 returned error can't find the container with id 69d66ab257a5571ce3ccf46837f2efb01b65fa5fc43d9ac4976908f807240c17 Sep 30 13:47:57 crc kubenswrapper[4763]: I0930 13:47:57.588189 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" event={"ID":"04c0360a-87b1-434f-8d7b-9aadd2e5ab33","Type":"ContainerStarted","Data":"69d66ab257a5571ce3ccf46837f2efb01b65fa5fc43d9ac4976908f807240c17"} Sep 30 13:47:59 crc kubenswrapper[4763]: I0930 13:47:59.605496 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" 
event={"ID":"04c0360a-87b1-434f-8d7b-9aadd2e5ab33","Type":"ContainerStarted","Data":"5e3d028e7acf425f82ba7bd55540be73b46d0fd68af3712656569a4287dce519"} Sep 30 13:47:59 crc kubenswrapper[4763]: I0930 13:47:59.608397 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk" event={"ID":"8f52d94c-384b-4cbe-ac9d-aeffdb2769bb","Type":"ContainerStarted","Data":"c09454010e02596373689574aaf2071205ce95d9750a436bc8d29e4c737d4d9e"} Sep 30 13:47:59 crc kubenswrapper[4763]: I0930 13:47:59.609622 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-d8g2c" event={"ID":"535ccdf4-0560-4eb1-bfc6-8135453e4e11","Type":"ContainerStarted","Data":"fad984bd6433c8ab496c0062e43916c8c4f6a40393c2ef257bbea39909709d9c"} Sep 30 13:47:59 crc kubenswrapper[4763]: I0930 13:47:59.610019 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:47:59 crc kubenswrapper[4763]: I0930 13:47:59.611530 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" event={"ID":"668f7b93-4e0d-4344-b856-1507f347c5a1","Type":"ContainerStarted","Data":"42cc0493135ebeb8c0475ee02ffb51c60e5367f372d0e82723b04235e965e58d"} Sep 30 13:47:59 crc kubenswrapper[4763]: I0930 13:47:59.611649 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" Sep 30 13:47:59 crc kubenswrapper[4763]: I0930 13:47:59.623505 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-85wg7" podStartSLOduration=2.278442547 podStartE2EDuration="4.623489287s" podCreationTimestamp="2025-09-30 13:47:55 +0000 UTC" firstStartedPulling="2025-09-30 13:47:56.754911632 +0000 UTC m=+748.893471907" lastFinishedPulling="2025-09-30 13:47:59.099958362 +0000 UTC m=+751.238518647" observedRunningTime="2025-09-30 13:47:59.621289401 +0000 UTC m=+751.759849686" watchObservedRunningTime="2025-09-30 13:47:59.623489287 +0000 UTC m=+751.762049572" Sep 30 13:47:59 crc kubenswrapper[4763]: I0930 13:47:59.636503 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" podStartSLOduration=1.90061737 podStartE2EDuration="4.6364863s" podCreationTimestamp="2025-09-30 13:47:55 +0000 UTC" firstStartedPulling="2025-09-30 13:47:56.36910213 +0000 UTC m=+748.507662425" lastFinishedPulling="2025-09-30 13:47:59.10497107 +0000 UTC m=+751.243531355" observedRunningTime="2025-09-30 13:47:59.633821151 +0000 UTC m=+751.772381456" watchObservedRunningTime="2025-09-30 13:47:59.6364863 +0000 UTC m=+751.775046585" Sep 30 13:47:59 crc kubenswrapper[4763]: I0930 13:47:59.653778 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-d8g2c" podStartSLOduration=1.517673561 podStartE2EDuration="4.653755491s" podCreationTimestamp="2025-09-30 13:47:55 +0000 UTC" firstStartedPulling="2025-09-30 13:47:55.964985261 +0000 UTC m=+748.103545546" lastFinishedPulling="2025-09-30 13:47:59.101067191 +0000 UTC m=+751.239627476" observedRunningTime="2025-09-30 13:47:59.650235382 +0000 UTC m=+751.788795667" watchObservedRunningTime="2025-09-30 13:47:59.653755491 +0000 UTC m=+751.792315776" Sep 30 13:48:03 crc kubenswrapper[4763]: I0930 13:48:03.634950 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk" 
event={"ID":"8f52d94c-384b-4cbe-ac9d-aeffdb2769bb","Type":"ContainerStarted","Data":"908f9178aeaa9c19ba19771d070d0a0a03bc6e5b726d87054fadc601fe0ce3a4"} Sep 30 13:48:03 crc kubenswrapper[4763]: I0930 13:48:03.652782 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-7zvfk" podStartSLOduration=1.9281231829999999 podStartE2EDuration="8.65275861s" podCreationTimestamp="2025-09-30 13:47:55 +0000 UTC" firstStartedPulling="2025-09-30 13:47:56.367269953 +0000 UTC m=+748.505830248" lastFinishedPulling="2025-09-30 13:48:03.09190539 +0000 UTC m=+755.230465675" observedRunningTime="2025-09-30 13:48:03.650167564 +0000 UTC m=+755.788727849" watchObservedRunningTime="2025-09-30 13:48:03.65275861 +0000 UTC m=+755.791318905" Sep 30 13:48:05 crc kubenswrapper[4763]: I0930 13:48:05.606886 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9rcjp"] Sep 30 13:48:05 crc kubenswrapper[4763]: I0930 13:48:05.607145 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" podUID="9eaed9c6-6995-4062-8c6b-a41853220149" containerName="controller-manager" containerID="cri-o://cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b" gracePeriod=30 Sep 30 13:48:05 crc kubenswrapper[4763]: I0930 13:48:05.725306 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv"] Sep 30 13:48:05 crc kubenswrapper[4763]: I0930 13:48:05.725502 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" podUID="9c7d4b69-3286-49c0-8a83-74bcccf25345" containerName="route-controller-manager" containerID="cri-o://53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6" gracePeriod=30 Sep 30 13:48:05 crc kubenswrapper[4763]: I0930 13:48:05.871084 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-d8g2c" Sep 30 13:48:05 crc kubenswrapper[4763]: I0930 13:48:05.969765 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.059763 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.059835 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.059889 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.060621 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f66002987a3e708ee53022f61f57bc4019ea893682ccce020d3b5027a63c2bf8"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.060693 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://f66002987a3e708ee53022f61f57bc4019ea893682ccce020d3b5027a63c2bf8" gracePeriod=600 Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.091018 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.159377 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-client-ca\") pod \"9eaed9c6-6995-4062-8c6b-a41853220149\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.159428 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-proxy-ca-bundles\") pod \"9eaed9c6-6995-4062-8c6b-a41853220149\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.159505 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9d2p\" (UniqueName: \"kubernetes.io/projected/9eaed9c6-6995-4062-8c6b-a41853220149-kube-api-access-f9d2p\") pod \"9eaed9c6-6995-4062-8c6b-a41853220149\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.159532 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9eaed9c6-6995-4062-8c6b-a41853220149-serving-cert\") pod \"9eaed9c6-6995-4062-8c6b-a41853220149\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.159559 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-config\") pod \"9eaed9c6-6995-4062-8c6b-a41853220149\" (UID: \"9eaed9c6-6995-4062-8c6b-a41853220149\") " Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.160150 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9eaed9c6-6995-4062-8c6b-a41853220149" (UID: "9eaed9c6-6995-4062-8c6b-a41853220149"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.160433 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-client-ca" (OuterVolumeSpecName: "client-ca") pod "9eaed9c6-6995-4062-8c6b-a41853220149" (UID: "9eaed9c6-6995-4062-8c6b-a41853220149"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.160450 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-config" (OuterVolumeSpecName: "config") pod "9eaed9c6-6995-4062-8c6b-a41853220149" (UID: "9eaed9c6-6995-4062-8c6b-a41853220149"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.165003 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaed9c6-6995-4062-8c6b-a41853220149-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9eaed9c6-6995-4062-8c6b-a41853220149" (UID: "9eaed9c6-6995-4062-8c6b-a41853220149"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.167045 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eaed9c6-6995-4062-8c6b-a41853220149-kube-api-access-f9d2p" (OuterVolumeSpecName: "kube-api-access-f9d2p") pod "9eaed9c6-6995-4062-8c6b-a41853220149" (UID: "9eaed9c6-6995-4062-8c6b-a41853220149"). InnerVolumeSpecName "kube-api-access-f9d2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.193415 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.193473 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.197491 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.260717 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7d4b69-3286-49c0-8a83-74bcccf25345-serving-cert\") pod \"9c7d4b69-3286-49c0-8a83-74bcccf25345\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.261045 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-client-ca\") pod \"9c7d4b69-3286-49c0-8a83-74bcccf25345\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.261100 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2f4l\" (UniqueName: \"kubernetes.io/projected/9c7d4b69-3286-49c0-8a83-74bcccf25345-kube-api-access-t2f4l\") pod \"9c7d4b69-3286-49c0-8a83-74bcccf25345\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.261217 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-config\") pod \"9c7d4b69-3286-49c0-8a83-74bcccf25345\" (UID: \"9c7d4b69-3286-49c0-8a83-74bcccf25345\") " Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.261433 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.261453 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.261467 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9d2p\" (UniqueName: \"kubernetes.io/projected/9eaed9c6-6995-4062-8c6b-a41853220149-kube-api-access-f9d2p\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.261477 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9eaed9c6-6995-4062-8c6b-a41853220149-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:06 
crc kubenswrapper[4763]: I0930 13:48:06.261488 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9eaed9c6-6995-4062-8c6b-a41853220149-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.261618 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-client-ca" (OuterVolumeSpecName: "client-ca") pod "9c7d4b69-3286-49c0-8a83-74bcccf25345" (UID: "9c7d4b69-3286-49c0-8a83-74bcccf25345"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.261739 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-config" (OuterVolumeSpecName: "config") pod "9c7d4b69-3286-49c0-8a83-74bcccf25345" (UID: "9c7d4b69-3286-49c0-8a83-74bcccf25345"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.265182 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c7d4b69-3286-49c0-8a83-74bcccf25345-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9c7d4b69-3286-49c0-8a83-74bcccf25345" (UID: "9c7d4b69-3286-49c0-8a83-74bcccf25345"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.266720 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c7d4b69-3286-49c0-8a83-74bcccf25345-kube-api-access-t2f4l" (OuterVolumeSpecName: "kube-api-access-t2f4l") pod "9c7d4b69-3286-49c0-8a83-74bcccf25345" (UID: "9c7d4b69-3286-49c0-8a83-74bcccf25345"). InnerVolumeSpecName "kube-api-access-t2f4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.362766 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.362798 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7d4b69-3286-49c0-8a83-74bcccf25345-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.362934 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c7d4b69-3286-49c0-8a83-74bcccf25345-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.363153 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2f4l\" (UniqueName: \"kubernetes.io/projected/9c7d4b69-3286-49c0-8a83-74bcccf25345-kube-api-access-t2f4l\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.652439 4763 generic.go:334] "Generic (PLEG): container finished" podID="9eaed9c6-6995-4062-8c6b-a41853220149" containerID="cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b" exitCode=0 Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.652545 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.652563 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" event={"ID":"9eaed9c6-6995-4062-8c6b-a41853220149","Type":"ContainerDied","Data":"cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b"} Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.652719 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9rcjp" event={"ID":"9eaed9c6-6995-4062-8c6b-a41853220149","Type":"ContainerDied","Data":"ad95d5e681e349a692a53b0315c14fd2214619cc565927e1b70d9382d6eb6438"} Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.652741 4763 scope.go:117] "RemoveContainer" containerID="cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.658956 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="f66002987a3e708ee53022f61f57bc4019ea893682ccce020d3b5027a63c2bf8" exitCode=0 Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.659045 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"f66002987a3e708ee53022f61f57bc4019ea893682ccce020d3b5027a63c2bf8"} Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.659082 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"2d22e10272d584e0d311db86eff7ac75db8f98341b6f7b1a40cf7584027c1ba8"} Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.661718 4763 generic.go:334] "Generic (PLEG): container finished" podID="9c7d4b69-3286-49c0-8a83-74bcccf25345" containerID="53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6" exitCode=0 Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.661923 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.662570 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" event={"ID":"9c7d4b69-3286-49c0-8a83-74bcccf25345","Type":"ContainerDied","Data":"53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6"} Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.662616 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv" event={"ID":"9c7d4b69-3286-49c0-8a83-74bcccf25345","Type":"ContainerDied","Data":"cfafc468c2a9234b54347475fcdc3b093ff2dcff98dfb0cd339e5711d3684228"} Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.666022 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cc8bd7b4-2ldgk" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.670854 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9rcjp"] Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.673934 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9rcjp"] Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.679649 4763 scope.go:117] "RemoveContainer" containerID="cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b" Sep 30 13:48:06 crc kubenswrapper[4763]: E0930 13:48:06.680146 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b\": container with ID starting with cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b not found: ID does not exist" containerID="cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.680186 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b"} err="failed to get container status \"cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b\": rpc error: code = NotFound desc = could not find container \"cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b\": container with ID starting with cce8490f3eee7f1b63ca9e19f9f9f365772ef6e5e7e10517169207880b54339b not found: ID does not exist" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.680216 4763 scope.go:117] "RemoveContainer" containerID="6a835f68fa095d0605d9b01f19066aa12d7ae1a68f6f7ff31a2cdf8fb87d2cb8" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.686311 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv"] Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.689484 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t2qjv"] Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.733300 4763 scope.go:117] "RemoveContainer" containerID="53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.752221 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-p5rvt"] Sep 30 13:48:06 
crc kubenswrapper[4763]: I0930 13:48:06.774093 4763 scope.go:117] "RemoveContainer" containerID="53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6" Sep 30 13:48:06 crc kubenswrapper[4763]: E0930 13:48:06.774866 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6\": container with ID starting with 53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6 not found: ID does not exist" containerID="53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.774902 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6"} err="failed to get container status \"53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6\": rpc error: code = NotFound desc = could not find container \"53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6\": container with ID starting with 53373bcce633984d7cb81824b35439946cdf7511625ef711db35c7f556177ee6 not found: ID does not exist" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.892230 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk"] Sep 30 13:48:06 crc kubenswrapper[4763]: E0930 13:48:06.893336 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7d4b69-3286-49c0-8a83-74bcccf25345" containerName="route-controller-manager" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.893368 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7d4b69-3286-49c0-8a83-74bcccf25345" containerName="route-controller-manager" Sep 30 13:48:06 crc kubenswrapper[4763]: E0930 13:48:06.893410 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eaed9c6-6995-4062-8c6b-a41853220149" containerName="controller-manager" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.893420 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eaed9c6-6995-4062-8c6b-a41853220149" containerName="controller-manager" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.894149 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eaed9c6-6995-4062-8c6b-a41853220149" containerName="controller-manager" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.894165 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c7d4b69-3286-49c0-8a83-74bcccf25345" containerName="route-controller-manager" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.894847 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.909371 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.912754 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-699dcd5df5-x9jf5"] Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.913712 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.914655 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.915297 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.915360 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.915628 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.915896 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.918377 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.918825 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.918830 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.918920 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.918998 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.921998 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.923891 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk"] Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.929275 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-699dcd5df5-x9jf5"] Sep 30 13:48:06 crc kubenswrapper[4763]: I0930 13:48:06.931109 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.076508 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98znn\" (UniqueName: 
\"kubernetes.io/projected/ec2ed484-c3e3-4434-9827-673e0ac568f8-kube-api-access-98znn\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.076634 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkxjz\" (UniqueName: \"kubernetes.io/projected/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-kube-api-access-qkxjz\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.076708 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec2ed484-c3e3-4434-9827-673e0ac568f8-config\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.076738 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec2ed484-c3e3-4434-9827-673e0ac568f8-serving-cert\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.076759 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-client-ca\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.076812 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec2ed484-c3e3-4434-9827-673e0ac568f8-proxy-ca-bundles\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.076841 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-serving-cert\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.076870 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-config\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.077008 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/ec2ed484-c3e3-4434-9827-673e0ac568f8-client-ca\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.178033 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkxjz\" (UniqueName: \"kubernetes.io/projected/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-kube-api-access-qkxjz\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.178095 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec2ed484-c3e3-4434-9827-673e0ac568f8-config\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.178123 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec2ed484-c3e3-4434-9827-673e0ac568f8-serving-cert\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.178142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-client-ca\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.178168 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec2ed484-c3e3-4434-9827-673e0ac568f8-proxy-ca-bundles\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.178200 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-serving-cert\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.178234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-config\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.178273 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec2ed484-c3e3-4434-9827-673e0ac568f8-client-ca\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: 
\"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.178508 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98znn\" (UniqueName: \"kubernetes.io/projected/ec2ed484-c3e3-4434-9827-673e0ac568f8-kube-api-access-98znn\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.179228 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-client-ca\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.179567 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec2ed484-c3e3-4434-9827-673e0ac568f8-proxy-ca-bundles\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.179765 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec2ed484-c3e3-4434-9827-673e0ac568f8-config\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.179808 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec2ed484-c3e3-4434-9827-673e0ac568f8-client-ca\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.180081 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-config\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.182409 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec2ed484-c3e3-4434-9827-673e0ac568f8-serving-cert\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.182469 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-serving-cert\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.197860 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-98znn\" (UniqueName: \"kubernetes.io/projected/ec2ed484-c3e3-4434-9827-673e0ac568f8-kube-api-access-98znn\") pod \"controller-manager-699dcd5df5-x9jf5\" (UID: \"ec2ed484-c3e3-4434-9827-673e0ac568f8\") " pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.206456 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkxjz\" (UniqueName: \"kubernetes.io/projected/8d12537f-9f28-43c2-acc5-dec5b9ac3cee-kube-api-access-qkxjz\") pod \"route-controller-manager-c7c6d88fc-ffjpk\" (UID: \"8d12537f-9f28-43c2-acc5-dec5b9ac3cee\") " pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.211869 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.256562 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.375857 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk"] Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.649058 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-699dcd5df5-x9jf5"] Sep 30 13:48:07 crc kubenswrapper[4763]: W0930 13:48:07.657249 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec2ed484_c3e3_4434_9827_673e0ac568f8.slice/crio-1713de29577f87766edbdfaf22f2de0a95bfb1c38ac17b7512ed6d0363117690 WatchSource:0}: Error finding container 1713de29577f87766edbdfaf22f2de0a95bfb1c38ac17b7512ed6d0363117690: Status 404 returned error can't find the container with id 1713de29577f87766edbdfaf22f2de0a95bfb1c38ac17b7512ed6d0363117690 Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.667805 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" event={"ID":"ec2ed484-c3e3-4434-9827-673e0ac568f8","Type":"ContainerStarted","Data":"1713de29577f87766edbdfaf22f2de0a95bfb1c38ac17b7512ed6d0363117690"} Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.670171 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" event={"ID":"8d12537f-9f28-43c2-acc5-dec5b9ac3cee","Type":"ContainerStarted","Data":"729896aaa7817d05a577ad94484279a3d5ba8327b35870c7fc1ccf80b47350e5"} Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.670214 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" event={"ID":"8d12537f-9f28-43c2-acc5-dec5b9ac3cee","Type":"ContainerStarted","Data":"e4063a518908121bb4152e10e4176a66f9d062d8dbd0656c2a93ecbdfd6620c4"} Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.670457 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.808833 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" Sep 30 13:48:07 crc kubenswrapper[4763]: I0930 13:48:07.827234 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c7c6d88fc-ffjpk" podStartSLOduration=2.827215998 podStartE2EDuration="2.827215998s" podCreationTimestamp="2025-09-30 13:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:48:07.68817089 +0000 UTC m=+759.826745596" watchObservedRunningTime="2025-09-30 13:48:07.827215998 +0000 UTC m=+759.965776283" Sep 30 13:48:08 crc kubenswrapper[4763]: I0930 13:48:08.497798 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c7d4b69-3286-49c0-8a83-74bcccf25345" path="/var/lib/kubelet/pods/9c7d4b69-3286-49c0-8a83-74bcccf25345/volumes" Sep 30 13:48:08 crc kubenswrapper[4763]: I0930 13:48:08.498385 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eaed9c6-6995-4062-8c6b-a41853220149" path="/var/lib/kubelet/pods/9eaed9c6-6995-4062-8c6b-a41853220149/volumes" Sep 30 13:48:08 crc kubenswrapper[4763]: I0930 13:48:08.680294 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" event={"ID":"ec2ed484-c3e3-4434-9827-673e0ac568f8","Type":"ContainerStarted","Data":"4437f6827cc30e5c9a00990b41075d3dab7806463fa4c0cc1806a18d990c1804"} Sep 30 13:48:08 crc kubenswrapper[4763]: I0930 13:48:08.680534 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:08 crc kubenswrapper[4763]: I0930 13:48:08.688733 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" Sep 30 13:48:08 crc kubenswrapper[4763]: I0930 13:48:08.723795 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-699dcd5df5-x9jf5" podStartSLOduration=3.723770317 podStartE2EDuration="3.723770317s" podCreationTimestamp="2025-09-30 13:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:48:08.711539263 +0000 UTC m=+760.850099558" watchObservedRunningTime="2025-09-30 13:48:08.723770317 +0000 UTC m=+760.862330602" Sep 30 13:48:12 crc kubenswrapper[4763]: I0930 13:48:12.994426 4763 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 13:48:15 crc kubenswrapper[4763]: I0930 13:48:15.827569 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dqq6n" Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.456385 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-48spc"] Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.458491 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.470936 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48spc"] Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.544948 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jhxq\" (UniqueName: \"kubernetes.io/projected/b8de6067-8735-482f-a5c8-b3103f6a8b39-kube-api-access-6jhxq\") pod \"redhat-marketplace-48spc\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.545045 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-utilities\") pod \"redhat-marketplace-48spc\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.545082 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-catalog-content\") pod \"redhat-marketplace-48spc\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.646350 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jhxq\" (UniqueName: \"kubernetes.io/projected/b8de6067-8735-482f-a5c8-b3103f6a8b39-kube-api-access-6jhxq\") pod \"redhat-marketplace-48spc\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.646446 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-utilities\") pod \"redhat-marketplace-48spc\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.646479 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-catalog-content\") pod \"redhat-marketplace-48spc\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.646926 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-catalog-content\") pod \"redhat-marketplace-48spc\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.647391 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-utilities\") pod \"redhat-marketplace-48spc\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.666981 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6jhxq\" (UniqueName: \"kubernetes.io/projected/b8de6067-8735-482f-a5c8-b3103f6a8b39-kube-api-access-6jhxq\") pod \"redhat-marketplace-48spc\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:19 crc kubenswrapper[4763]: I0930 13:48:19.785098 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:20 crc kubenswrapper[4763]: I0930 13:48:20.223930 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48spc"] Sep 30 13:48:20 crc kubenswrapper[4763]: W0930 13:48:20.240338 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8de6067_8735_482f_a5c8_b3103f6a8b39.slice/crio-b01c126e37f73dcb4380c1044c33a08915644a66830ee7d87ea72fc9b65695fb WatchSource:0}: Error finding container b01c126e37f73dcb4380c1044c33a08915644a66830ee7d87ea72fc9b65695fb: Status 404 returned error can't find the container with id b01c126e37f73dcb4380c1044c33a08915644a66830ee7d87ea72fc9b65695fb Sep 30 13:48:20 crc kubenswrapper[4763]: I0930 13:48:20.753226 4763 generic.go:334] "Generic (PLEG): container finished" podID="b8de6067-8735-482f-a5c8-b3103f6a8b39" containerID="819afb311b5fcc630a1ae9dcf845e699e243d7ab202a5e4999b2656d7795e544" exitCode=0 Sep 30 13:48:20 crc kubenswrapper[4763]: I0930 13:48:20.753370 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48spc" event={"ID":"b8de6067-8735-482f-a5c8-b3103f6a8b39","Type":"ContainerDied","Data":"819afb311b5fcc630a1ae9dcf845e699e243d7ab202a5e4999b2656d7795e544"} Sep 30 13:48:20 crc kubenswrapper[4763]: I0930 13:48:20.753526 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48spc" event={"ID":"b8de6067-8735-482f-a5c8-b3103f6a8b39","Type":"ContainerStarted","Data":"b01c126e37f73dcb4380c1044c33a08915644a66830ee7d87ea72fc9b65695fb"} Sep 30 13:48:22 crc kubenswrapper[4763]: I0930 13:48:22.778964 4763 generic.go:334] "Generic (PLEG): container finished" podID="b8de6067-8735-482f-a5c8-b3103f6a8b39" containerID="0eb8bbed3f7150e70df08495cf20b84d65f25bc52da9acfc05a3f768b002f4e0" exitCode=0 Sep 30 13:48:22 crc kubenswrapper[4763]: I0930 13:48:22.779127 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48spc" event={"ID":"b8de6067-8735-482f-a5c8-b3103f6a8b39","Type":"ContainerDied","Data":"0eb8bbed3f7150e70df08495cf20b84d65f25bc52da9acfc05a3f768b002f4e0"} Sep 30 13:48:23 crc kubenswrapper[4763]: I0930 13:48:23.789153 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48spc" event={"ID":"b8de6067-8735-482f-a5c8-b3103f6a8b39","Type":"ContainerStarted","Data":"6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8"} Sep 30 13:48:23 crc kubenswrapper[4763]: I0930 13:48:23.807008 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-48spc" podStartSLOduration=2.260720096 podStartE2EDuration="4.806991265s" podCreationTimestamp="2025-09-30 13:48:19 +0000 UTC" firstStartedPulling="2025-09-30 13:48:20.75510953 +0000 UTC m=+772.893669825" lastFinishedPulling="2025-09-30 13:48:23.301380709 +0000 UTC m=+775.439940994" observedRunningTime="2025-09-30 13:48:23.804140922 +0000 UTC m=+775.942701207" 
watchObservedRunningTime="2025-09-30 13:48:23.806991265 +0000 UTC m=+775.945551550" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.625862 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h"] Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.627425 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.629821 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.642236 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h"] Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.671278 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.671667 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxjdj\" (UniqueName: \"kubernetes.io/projected/9410b485-e90e-4cb8-a924-d82596f1efee-kube-api-access-sxjdj\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.671832 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.772814 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.772962 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxjdj\" (UniqueName: \"kubernetes.io/projected/9410b485-e90e-4cb8-a924-d82596f1efee-kube-api-access-sxjdj\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.773027 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-util\") pod 
\"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.773832 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.773849 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.785549 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.785580 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.799039 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxjdj\" (UniqueName: \"kubernetes.io/projected/9410b485-e90e-4cb8-a924-d82596f1efee-kube-api-access-sxjdj\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.829824 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.871308 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:29 crc kubenswrapper[4763]: I0930 13:48:29.943695 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:30 crc kubenswrapper[4763]: I0930 13:48:30.336586 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h"] Sep 30 13:48:30 crc kubenswrapper[4763]: W0930 13:48:30.340697 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9410b485_e90e_4cb8_a924_d82596f1efee.slice/crio-8c7a99b0f7278d9dc5832ec3192b6cddba0060592eb09dc869201977b4b4897f WatchSource:0}: Error finding container 8c7a99b0f7278d9dc5832ec3192b6cddba0060592eb09dc869201977b4b4897f: Status 404 returned error can't find the container with id 8c7a99b0f7278d9dc5832ec3192b6cddba0060592eb09dc869201977b4b4897f Sep 30 13:48:30 crc kubenswrapper[4763]: I0930 13:48:30.839794 4763 generic.go:334] "Generic (PLEG): container finished" podID="9410b485-e90e-4cb8-a924-d82596f1efee" containerID="3435c10edd404ae58cf985ffee3d3a8b54e6140f950430486e7227ec20eb4b64" exitCode=0 Sep 30 13:48:30 crc kubenswrapper[4763]: I0930 13:48:30.839868 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" event={"ID":"9410b485-e90e-4cb8-a924-d82596f1efee","Type":"ContainerDied","Data":"3435c10edd404ae58cf985ffee3d3a8b54e6140f950430486e7227ec20eb4b64"} Sep 30 13:48:30 crc kubenswrapper[4763]: I0930 13:48:30.840267 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" event={"ID":"9410b485-e90e-4cb8-a924-d82596f1efee","Type":"ContainerStarted","Data":"8c7a99b0f7278d9dc5832ec3192b6cddba0060592eb09dc869201977b4b4897f"} Sep 30 13:48:31 crc kubenswrapper[4763]: I0930 13:48:31.802909 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-p5rvt" podUID="cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" containerName="console" containerID="cri-o://062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604" gracePeriod=15 Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.267432 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-p5rvt_cd1f92d8-b30e-4e04-b40f-b72b9303ac4e/console/0.log" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.267784 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.334762 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-klqzm"] Sep 30 13:48:32 crc kubenswrapper[4763]: E0930 13:48:32.334988 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" containerName="console" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.335000 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" containerName="console" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.335120 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" containerName="console" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.335906 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.351680 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-klqzm"] Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.405040 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-serving-cert\") pod \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.405095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-service-ca\") pod \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.405140 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-config\") pod \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.405233 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-oauth-config\") pod \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.405270 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-trusted-ca-bundle\") pod \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.405303 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgck\" (UniqueName: \"kubernetes.io/projected/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-kube-api-access-xcgck\") pod \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.405319 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-oauth-serving-cert\") pod \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\" (UID: \"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e\") " Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.405459 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhwlm\" (UniqueName: \"kubernetes.io/projected/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-kube-api-access-qhwlm\") pod \"redhat-operators-klqzm\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.405482 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-catalog-content\") pod \"redhat-operators-klqzm\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " 
pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.405501 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-utilities\") pod \"redhat-operators-klqzm\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.406104 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-service-ca" (OuterVolumeSpecName: "service-ca") pod "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" (UID: "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.406439 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" (UID: "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.406522 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-config" (OuterVolumeSpecName: "console-config") pod "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" (UID: "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.406532 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" (UID: "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.411538 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" (UID: "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.411998 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" (UID: "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.413856 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-kube-api-access-xcgck" (OuterVolumeSpecName: "kube-api-access-xcgck") pod "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" (UID: "cd1f92d8-b30e-4e04-b40f-b72b9303ac4e"). InnerVolumeSpecName "kube-api-access-xcgck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.506505 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhwlm\" (UniqueName: \"kubernetes.io/projected/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-kube-api-access-qhwlm\") pod \"redhat-operators-klqzm\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.506555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-utilities\") pod \"redhat-operators-klqzm\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.506573 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-catalog-content\") pod \"redhat-operators-klqzm\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.506661 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.506673 4763 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.506685 4763 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.506693 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.506702 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgck\" (UniqueName: \"kubernetes.io/projected/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-kube-api-access-xcgck\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.506710 4763 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.506718 4763 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.507127 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-catalog-content\") pod \"redhat-operators-klqzm\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:32 crc 
kubenswrapper[4763]: I0930 13:48:32.507291 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-utilities\") pod \"redhat-operators-klqzm\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.525697 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhwlm\" (UniqueName: \"kubernetes.io/projected/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-kube-api-access-qhwlm\") pod \"redhat-operators-klqzm\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.660804 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.851527 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-p5rvt_cd1f92d8-b30e-4e04-b40f-b72b9303ac4e/console/0.log" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.851573 4763 generic.go:334] "Generic (PLEG): container finished" podID="cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" containerID="062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604" exitCode=2 Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.851622 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p5rvt" event={"ID":"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e","Type":"ContainerDied","Data":"062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604"} Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.851650 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p5rvt" event={"ID":"cd1f92d8-b30e-4e04-b40f-b72b9303ac4e","Type":"ContainerDied","Data":"21f93a369f3254b198e9bff025a86352cc7394e302c69e617c4c1c02fa9a46a4"} Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.851669 4763 scope.go:117] "RemoveContainer" containerID="062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.851675 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-p5rvt" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.870418 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-p5rvt"] Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.874488 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-p5rvt"] Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.965650 4763 scope.go:117] "RemoveContainer" containerID="062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604" Sep 30 13:48:32 crc kubenswrapper[4763]: E0930 13:48:32.966297 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604\": container with ID starting with 062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604 not found: ID does not exist" containerID="062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604" Sep 30 13:48:32 crc kubenswrapper[4763]: I0930 13:48:32.966331 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604"} err="failed to get container status \"062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604\": rpc error: code = NotFound desc = could not find container \"062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604\": container with ID starting with 062cf0b4806e51e685e76d1918c94e7ee78b52d7467990e14f1f175d31042604 not found: ID does not exist" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.328828 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48spc"] Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.329492 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-48spc" podUID="b8de6067-8735-482f-a5c8-b3103f6a8b39" containerName="registry-server" containerID="cri-o://6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8" gracePeriod=2 Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.396577 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-klqzm"] Sep 30 13:48:33 crc kubenswrapper[4763]: W0930 13:48:33.478290 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode76dc5ff_8fd4_4f58_9ab6_0ebfb7a5917f.slice/crio-70d307659b11a971dc738464ec6ab3371062de7c7b663cf23d5eb1e037a372e8 WatchSource:0}: Error finding container 70d307659b11a971dc738464ec6ab3371062de7c7b663cf23d5eb1e037a372e8: Status 404 returned error can't find the container with id 70d307659b11a971dc738464ec6ab3371062de7c7b663cf23d5eb1e037a372e8 Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.828125 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.866475 4763 generic.go:334] "Generic (PLEG): container finished" podID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" containerID="28d187ae2b9168b36ec462d03fd078f5286cb6830c1a16beb7746237ac99f51c" exitCode=0 Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.866798 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klqzm" event={"ID":"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f","Type":"ContainerDied","Data":"28d187ae2b9168b36ec462d03fd078f5286cb6830c1a16beb7746237ac99f51c"} Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.866928 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klqzm" event={"ID":"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f","Type":"ContainerStarted","Data":"70d307659b11a971dc738464ec6ab3371062de7c7b663cf23d5eb1e037a372e8"} Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.871473 4763 generic.go:334] "Generic (PLEG): container finished" podID="b8de6067-8735-482f-a5c8-b3103f6a8b39" containerID="6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8" exitCode=0 Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.871551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48spc" event={"ID":"b8de6067-8735-482f-a5c8-b3103f6a8b39","Type":"ContainerDied","Data":"6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8"} Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.871586 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48spc" event={"ID":"b8de6067-8735-482f-a5c8-b3103f6a8b39","Type":"ContainerDied","Data":"b01c126e37f73dcb4380c1044c33a08915644a66830ee7d87ea72fc9b65695fb"} Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.871628 4763 scope.go:117] "RemoveContainer" containerID="6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.871709 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48spc" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.874409 4763 generic.go:334] "Generic (PLEG): container finished" podID="9410b485-e90e-4cb8-a924-d82596f1efee" containerID="655f8b310508251aa83cb479590deb333c6df32d742b69dd50e7458f39fb786e" exitCode=0 Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.874434 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" event={"ID":"9410b485-e90e-4cb8-a924-d82596f1efee","Type":"ContainerDied","Data":"655f8b310508251aa83cb479590deb333c6df32d742b69dd50e7458f39fb786e"} Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.890985 4763 scope.go:117] "RemoveContainer" containerID="0eb8bbed3f7150e70df08495cf20b84d65f25bc52da9acfc05a3f768b002f4e0" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.910173 4763 scope.go:117] "RemoveContainer" containerID="819afb311b5fcc630a1ae9dcf845e699e243d7ab202a5e4999b2656d7795e544" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.924092 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-catalog-content\") pod \"b8de6067-8735-482f-a5c8-b3103f6a8b39\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.924193 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-utilities\") pod \"b8de6067-8735-482f-a5c8-b3103f6a8b39\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.924290 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jhxq\" (UniqueName: \"kubernetes.io/projected/b8de6067-8735-482f-a5c8-b3103f6a8b39-kube-api-access-6jhxq\") pod \"b8de6067-8735-482f-a5c8-b3103f6a8b39\" (UID: \"b8de6067-8735-482f-a5c8-b3103f6a8b39\") " Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.925831 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-utilities" (OuterVolumeSpecName: "utilities") pod "b8de6067-8735-482f-a5c8-b3103f6a8b39" (UID: "b8de6067-8735-482f-a5c8-b3103f6a8b39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.932128 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8de6067-8735-482f-a5c8-b3103f6a8b39-kube-api-access-6jhxq" (OuterVolumeSpecName: "kube-api-access-6jhxq") pod "b8de6067-8735-482f-a5c8-b3103f6a8b39" (UID: "b8de6067-8735-482f-a5c8-b3103f6a8b39"). InnerVolumeSpecName "kube-api-access-6jhxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.938956 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8de6067-8735-482f-a5c8-b3103f6a8b39" (UID: "b8de6067-8735-482f-a5c8-b3103f6a8b39"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.955276 4763 scope.go:117] "RemoveContainer" containerID="6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8" Sep 30 13:48:33 crc kubenswrapper[4763]: E0930 13:48:33.955828 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8\": container with ID starting with 6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8 not found: ID does not exist" containerID="6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.955871 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8"} err="failed to get container status \"6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8\": rpc error: code = NotFound desc = could not find container \"6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8\": container with ID starting with 6b1bb8f3c89d6aa87ba4c30f27864659ffdd83381fcfd645d81c5077256db4a8 not found: ID does not exist" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.955896 4763 scope.go:117] "RemoveContainer" containerID="0eb8bbed3f7150e70df08495cf20b84d65f25bc52da9acfc05a3f768b002f4e0" Sep 30 13:48:33 crc kubenswrapper[4763]: E0930 13:48:33.956207 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb8bbed3f7150e70df08495cf20b84d65f25bc52da9acfc05a3f768b002f4e0\": container with ID starting with 0eb8bbed3f7150e70df08495cf20b84d65f25bc52da9acfc05a3f768b002f4e0 not found: ID does not exist" containerID="0eb8bbed3f7150e70df08495cf20b84d65f25bc52da9acfc05a3f768b002f4e0" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.956248 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb8bbed3f7150e70df08495cf20b84d65f25bc52da9acfc05a3f768b002f4e0"} err="failed to get container status \"0eb8bbed3f7150e70df08495cf20b84d65f25bc52da9acfc05a3f768b002f4e0\": rpc error: code = NotFound desc = could not find container \"0eb8bbed3f7150e70df08495cf20b84d65f25bc52da9acfc05a3f768b002f4e0\": container with ID starting with 0eb8bbed3f7150e70df08495cf20b84d65f25bc52da9acfc05a3f768b002f4e0 not found: ID does not exist" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.956276 4763 scope.go:117] "RemoveContainer" containerID="819afb311b5fcc630a1ae9dcf845e699e243d7ab202a5e4999b2656d7795e544" Sep 30 13:48:33 crc kubenswrapper[4763]: E0930 13:48:33.956544 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"819afb311b5fcc630a1ae9dcf845e699e243d7ab202a5e4999b2656d7795e544\": container with ID starting with 819afb311b5fcc630a1ae9dcf845e699e243d7ab202a5e4999b2656d7795e544 not found: ID does not exist" containerID="819afb311b5fcc630a1ae9dcf845e699e243d7ab202a5e4999b2656d7795e544" Sep 30 13:48:33 crc kubenswrapper[4763]: I0930 13:48:33.956641 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"819afb311b5fcc630a1ae9dcf845e699e243d7ab202a5e4999b2656d7795e544"} err="failed to get container status \"819afb311b5fcc630a1ae9dcf845e699e243d7ab202a5e4999b2656d7795e544\": rpc error: code = NotFound desc = could not 
find container \"819afb311b5fcc630a1ae9dcf845e699e243d7ab202a5e4999b2656d7795e544\": container with ID starting with 819afb311b5fcc630a1ae9dcf845e699e243d7ab202a5e4999b2656d7795e544 not found: ID does not exist" Sep 30 13:48:34 crc kubenswrapper[4763]: I0930 13:48:34.026165 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jhxq\" (UniqueName: \"kubernetes.io/projected/b8de6067-8735-482f-a5c8-b3103f6a8b39-kube-api-access-6jhxq\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:34 crc kubenswrapper[4763]: I0930 13:48:34.026198 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:34 crc kubenswrapper[4763]: I0930 13:48:34.026210 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8de6067-8735-482f-a5c8-b3103f6a8b39-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:34 crc kubenswrapper[4763]: I0930 13:48:34.253307 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48spc"] Sep 30 13:48:34 crc kubenswrapper[4763]: I0930 13:48:34.259193 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-48spc"] Sep 30 13:48:34 crc kubenswrapper[4763]: I0930 13:48:34.502928 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8de6067-8735-482f-a5c8-b3103f6a8b39" path="/var/lib/kubelet/pods/b8de6067-8735-482f-a5c8-b3103f6a8b39/volumes" Sep 30 13:48:34 crc kubenswrapper[4763]: I0930 13:48:34.504363 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd1f92d8-b30e-4e04-b40f-b72b9303ac4e" path="/var/lib/kubelet/pods/cd1f92d8-b30e-4e04-b40f-b72b9303ac4e/volumes" Sep 30 13:48:34 crc kubenswrapper[4763]: I0930 13:48:34.885140 4763 generic.go:334] "Generic (PLEG): container finished" podID="9410b485-e90e-4cb8-a924-d82596f1efee" containerID="7f803397e4c88d243abd271fa4add2cf3bf8cef5285691a30c46e16bb6ab84c3" exitCode=0 Sep 30 13:48:34 crc kubenswrapper[4763]: I0930 13:48:34.885230 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" event={"ID":"9410b485-e90e-4cb8-a924-d82596f1efee","Type":"ContainerDied","Data":"7f803397e4c88d243abd271fa4add2cf3bf8cef5285691a30c46e16bb6ab84c3"} Sep 30 13:48:35 crc kubenswrapper[4763]: I0930 13:48:35.894781 4763 generic.go:334] "Generic (PLEG): container finished" podID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" containerID="27f50fc80f8c0ebbd19c7d8041355292a538ce90fa6725368c1ee89a056561b4" exitCode=0 Sep 30 13:48:35 crc kubenswrapper[4763]: I0930 13:48:35.894841 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klqzm" event={"ID":"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f","Type":"ContainerDied","Data":"27f50fc80f8c0ebbd19c7d8041355292a538ce90fa6725368c1ee89a056561b4"} Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.332508 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.356361 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxjdj\" (UniqueName: \"kubernetes.io/projected/9410b485-e90e-4cb8-a924-d82596f1efee-kube-api-access-sxjdj\") pod \"9410b485-e90e-4cb8-a924-d82596f1efee\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.356464 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-util\") pod \"9410b485-e90e-4cb8-a924-d82596f1efee\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.356548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-bundle\") pod \"9410b485-e90e-4cb8-a924-d82596f1efee\" (UID: \"9410b485-e90e-4cb8-a924-d82596f1efee\") " Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.358018 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-bundle" (OuterVolumeSpecName: "bundle") pod "9410b485-e90e-4cb8-a924-d82596f1efee" (UID: "9410b485-e90e-4cb8-a924-d82596f1efee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.365728 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9410b485-e90e-4cb8-a924-d82596f1efee-kube-api-access-sxjdj" (OuterVolumeSpecName: "kube-api-access-sxjdj") pod "9410b485-e90e-4cb8-a924-d82596f1efee" (UID: "9410b485-e90e-4cb8-a924-d82596f1efee"). InnerVolumeSpecName "kube-api-access-sxjdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.376317 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-util" (OuterVolumeSpecName: "util") pod "9410b485-e90e-4cb8-a924-d82596f1efee" (UID: "9410b485-e90e-4cb8-a924-d82596f1efee"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.458856 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxjdj\" (UniqueName: \"kubernetes.io/projected/9410b485-e90e-4cb8-a924-d82596f1efee-kube-api-access-sxjdj\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.459215 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-util\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.459244 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9410b485-e90e-4cb8-a924-d82596f1efee-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.907100 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.907100 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h" event={"ID":"9410b485-e90e-4cb8-a924-d82596f1efee","Type":"ContainerDied","Data":"8c7a99b0f7278d9dc5832ec3192b6cddba0060592eb09dc869201977b4b4897f"} Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.907428 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c7a99b0f7278d9dc5832ec3192b6cddba0060592eb09dc869201977b4b4897f" Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.912769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klqzm" event={"ID":"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f","Type":"ContainerStarted","Data":"220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42"} Sep 30 13:48:36 crc kubenswrapper[4763]: I0930 13:48:36.945737 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-klqzm" podStartSLOduration=2.246954951 podStartE2EDuration="4.945717569s" podCreationTimestamp="2025-09-30 13:48:32 +0000 UTC" firstStartedPulling="2025-09-30 13:48:33.868056328 +0000 UTC m=+786.006616613" lastFinishedPulling="2025-09-30 13:48:36.566818946 +0000 UTC m=+788.705379231" observedRunningTime="2025-09-30 13:48:36.93913022 +0000 UTC m=+789.077690575" watchObservedRunningTime="2025-09-30 13:48:36.945717569 +0000 UTC m=+789.084277864" Sep 30 13:48:42 crc kubenswrapper[4763]: I0930 13:48:42.661331 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:42 crc kubenswrapper[4763]: I0930 13:48:42.662004 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:42 crc kubenswrapper[4763]: I0930 13:48:42.700195 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:42 crc kubenswrapper[4763]: I0930 13:48:42.985945 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.528345 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-klqzm"] Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.529085 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-klqzm" podUID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" containerName="registry-server" containerID="cri-o://220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42" gracePeriod=2 Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.638482 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz"] Sep 30 13:48:45 crc kubenswrapper[4763]: E0930 13:48:45.643207 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8de6067-8735-482f-a5c8-b3103f6a8b39" containerName="extract-utilities" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.643287 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8de6067-8735-482f-a5c8-b3103f6a8b39" containerName="extract-utilities" Sep 30 13:48:45 crc 
kubenswrapper[4763]: E0930 13:48:45.643344 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9410b485-e90e-4cb8-a924-d82596f1efee" containerName="util" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.643400 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9410b485-e90e-4cb8-a924-d82596f1efee" containerName="util" Sep 30 13:48:45 crc kubenswrapper[4763]: E0930 13:48:45.643452 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9410b485-e90e-4cb8-a924-d82596f1efee" containerName="pull" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.643516 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9410b485-e90e-4cb8-a924-d82596f1efee" containerName="pull" Sep 30 13:48:45 crc kubenswrapper[4763]: E0930 13:48:45.643568 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8de6067-8735-482f-a5c8-b3103f6a8b39" containerName="extract-content" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.643638 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8de6067-8735-482f-a5c8-b3103f6a8b39" containerName="extract-content" Sep 30 13:48:45 crc kubenswrapper[4763]: E0930 13:48:45.643704 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8de6067-8735-482f-a5c8-b3103f6a8b39" containerName="registry-server" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.643754 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8de6067-8735-482f-a5c8-b3103f6a8b39" containerName="registry-server" Sep 30 13:48:45 crc kubenswrapper[4763]: E0930 13:48:45.643807 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9410b485-e90e-4cb8-a924-d82596f1efee" containerName="extract" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.643857 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9410b485-e90e-4cb8-a924-d82596f1efee" containerName="extract" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.643998 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9410b485-e90e-4cb8-a924-d82596f1efee" containerName="extract" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.644060 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8de6067-8735-482f-a5c8-b3103f6a8b39" containerName="registry-server" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.644494 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.646060 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.650050 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.650663 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.650785 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-mmvhq" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.651638 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.663711 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz"] Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.765592 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6-webhook-cert\") pod \"metallb-operator-controller-manager-68dc9cffd9-8prvz\" (UID: \"06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6\") " pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.765685 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6-apiservice-cert\") pod \"metallb-operator-controller-manager-68dc9cffd9-8prvz\" (UID: \"06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6\") " pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.765724 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndssg\" (UniqueName: \"kubernetes.io/projected/06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6-kube-api-access-ndssg\") pod \"metallb-operator-controller-manager-68dc9cffd9-8prvz\" (UID: \"06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6\") " pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.867434 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6-webhook-cert\") pod \"metallb-operator-controller-manager-68dc9cffd9-8prvz\" (UID: \"06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6\") " pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.867551 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6-apiservice-cert\") pod \"metallb-operator-controller-manager-68dc9cffd9-8prvz\" (UID: \"06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6\") " pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.867575 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndssg\" (UniqueName: \"kubernetes.io/projected/06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6-kube-api-access-ndssg\") pod \"metallb-operator-controller-manager-68dc9cffd9-8prvz\" (UID: \"06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6\") " pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.874200 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6-webhook-cert\") pod \"metallb-operator-controller-manager-68dc9cffd9-8prvz\" (UID: \"06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6\") " pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.874200 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6-apiservice-cert\") pod \"metallb-operator-controller-manager-68dc9cffd9-8prvz\" (UID: \"06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6\") " pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.885190 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndssg\" (UniqueName: \"kubernetes.io/projected/06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6-kube-api-access-ndssg\") pod \"metallb-operator-controller-manager-68dc9cffd9-8prvz\" (UID: \"06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6\") " pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.962185 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq"] Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.963012 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.964797 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.965045 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-q7wfb" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.966205 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 13:48:45 crc kubenswrapper[4763]: I0930 13:48:45.973420 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.014176 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq"] Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.069997 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a15a621-d993-4cee-a58d-dbf5b4361ede-webhook-cert\") pod \"metallb-operator-webhook-server-94474c8f7-zxvpq\" (UID: \"7a15a621-d993-4cee-a58d-dbf5b4361ede\") " pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.070135 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kg7q\" (UniqueName: \"kubernetes.io/projected/7a15a621-d993-4cee-a58d-dbf5b4361ede-kube-api-access-9kg7q\") pod \"metallb-operator-webhook-server-94474c8f7-zxvpq\" (UID: \"7a15a621-d993-4cee-a58d-dbf5b4361ede\") " pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.070174 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a15a621-d993-4cee-a58d-dbf5b4361ede-apiservice-cert\") pod \"metallb-operator-webhook-server-94474c8f7-zxvpq\" (UID: \"7a15a621-d993-4cee-a58d-dbf5b4361ede\") " pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.171017 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kg7q\" (UniqueName: \"kubernetes.io/projected/7a15a621-d993-4cee-a58d-dbf5b4361ede-kube-api-access-9kg7q\") pod \"metallb-operator-webhook-server-94474c8f7-zxvpq\" (UID: \"7a15a621-d993-4cee-a58d-dbf5b4361ede\") " pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.171351 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a15a621-d993-4cee-a58d-dbf5b4361ede-apiservice-cert\") pod \"metallb-operator-webhook-server-94474c8f7-zxvpq\" (UID: \"7a15a621-d993-4cee-a58d-dbf5b4361ede\") " pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.171402 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a15a621-d993-4cee-a58d-dbf5b4361ede-webhook-cert\") pod \"metallb-operator-webhook-server-94474c8f7-zxvpq\" (UID: \"7a15a621-d993-4cee-a58d-dbf5b4361ede\") " pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.177318 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a15a621-d993-4cee-a58d-dbf5b4361ede-apiservice-cert\") pod \"metallb-operator-webhook-server-94474c8f7-zxvpq\" (UID: \"7a15a621-d993-4cee-a58d-dbf5b4361ede\") " pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.178340 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/7a15a621-d993-4cee-a58d-dbf5b4361ede-webhook-cert\") pod \"metallb-operator-webhook-server-94474c8f7-zxvpq\" (UID: \"7a15a621-d993-4cee-a58d-dbf5b4361ede\") " pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.185791 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kg7q\" (UniqueName: \"kubernetes.io/projected/7a15a621-d993-4cee-a58d-dbf5b4361ede-kube-api-access-9kg7q\") pod \"metallb-operator-webhook-server-94474c8f7-zxvpq\" (UID: \"7a15a621-d993-4cee-a58d-dbf5b4361ede\") " pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.279264 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.443582 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz"] Sep 30 13:48:46 crc kubenswrapper[4763]: W0930 13:48:46.770361 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a15a621_d993_4cee_a58d_dbf5b4361ede.slice/crio-82bbe881e6ccc30d2899be3116f1aef171e74d7c62dbca37af7a6dc01910ce77 WatchSource:0}: Error finding container 82bbe881e6ccc30d2899be3116f1aef171e74d7c62dbca37af7a6dc01910ce77: Status 404 returned error can't find the container with id 82bbe881e6ccc30d2899be3116f1aef171e74d7c62dbca37af7a6dc01910ce77 Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.773170 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq"] Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.971522 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" event={"ID":"7a15a621-d993-4cee-a58d-dbf5b4361ede","Type":"ContainerStarted","Data":"82bbe881e6ccc30d2899be3116f1aef171e74d7c62dbca37af7a6dc01910ce77"} Sep 30 13:48:46 crc kubenswrapper[4763]: I0930 13:48:46.972631 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" event={"ID":"06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6","Type":"ContainerStarted","Data":"8f0e3d3b9f4e66097e2f63efdaad76d78884cba881a98e64828bb64453ace7d6"} Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.735262 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.892568 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhwlm\" (UniqueName: \"kubernetes.io/projected/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-kube-api-access-qhwlm\") pod \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.892640 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-catalog-content\") pod \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.892674 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-utilities\") pod \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\" (UID: \"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f\") " Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.893533 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-utilities" (OuterVolumeSpecName: "utilities") pod "e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" (UID: "e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.905508 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-kube-api-access-qhwlm" (OuterVolumeSpecName: "kube-api-access-qhwlm") pod "e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" (UID: "e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f"). InnerVolumeSpecName "kube-api-access-qhwlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.972690 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" (UID: "e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.979908 4763 generic.go:334] "Generic (PLEG): container finished" podID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" containerID="220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42" exitCode=0 Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.979958 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klqzm" event={"ID":"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f","Type":"ContainerDied","Data":"220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42"} Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.979966 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-klqzm" Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.979989 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klqzm" event={"ID":"e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f","Type":"ContainerDied","Data":"70d307659b11a971dc738464ec6ab3371062de7c7b663cf23d5eb1e037a372e8"} Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.980011 4763 scope.go:117] "RemoveContainer" containerID="220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42" Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.993694 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhwlm\" (UniqueName: \"kubernetes.io/projected/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-kube-api-access-qhwlm\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.993724 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.993738 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:47 crc kubenswrapper[4763]: I0930 13:48:47.995092 4763 scope.go:117] "RemoveContainer" containerID="27f50fc80f8c0ebbd19c7d8041355292a538ce90fa6725368c1ee89a056561b4" Sep 30 13:48:48 crc kubenswrapper[4763]: I0930 13:48:48.010283 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-klqzm"] Sep 30 13:48:48 crc kubenswrapper[4763]: I0930 13:48:48.013340 4763 scope.go:117] "RemoveContainer" containerID="28d187ae2b9168b36ec462d03fd078f5286cb6830c1a16beb7746237ac99f51c" Sep 30 13:48:48 crc kubenswrapper[4763]: I0930 13:48:48.016214 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-klqzm"] Sep 30 13:48:48 crc kubenswrapper[4763]: I0930 13:48:48.042178 4763 scope.go:117] "RemoveContainer" containerID="220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42" Sep 30 13:48:48 crc kubenswrapper[4763]: E0930 13:48:48.046068 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42\": container with ID starting with 220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42 not found: ID does not exist" containerID="220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42" Sep 30 13:48:48 crc kubenswrapper[4763]: I0930 13:48:48.046125 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42"} err="failed to get container status \"220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42\": rpc error: code = NotFound desc = could not find container \"220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42\": container with ID starting with 220a31ece76062286f59b1b7410f08ce4de2c2283933e01b8caaca718cfa2d42 not found: ID does not exist" Sep 30 13:48:48 crc kubenswrapper[4763]: I0930 13:48:48.046159 4763 scope.go:117] "RemoveContainer" containerID="27f50fc80f8c0ebbd19c7d8041355292a538ce90fa6725368c1ee89a056561b4" Sep 30 13:48:48 crc kubenswrapper[4763]: E0930 
13:48:48.046985 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f50fc80f8c0ebbd19c7d8041355292a538ce90fa6725368c1ee89a056561b4\": container with ID starting with 27f50fc80f8c0ebbd19c7d8041355292a538ce90fa6725368c1ee89a056561b4 not found: ID does not exist" containerID="27f50fc80f8c0ebbd19c7d8041355292a538ce90fa6725368c1ee89a056561b4" Sep 30 13:48:48 crc kubenswrapper[4763]: I0930 13:48:48.047220 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f50fc80f8c0ebbd19c7d8041355292a538ce90fa6725368c1ee89a056561b4"} err="failed to get container status \"27f50fc80f8c0ebbd19c7d8041355292a538ce90fa6725368c1ee89a056561b4\": rpc error: code = NotFound desc = could not find container \"27f50fc80f8c0ebbd19c7d8041355292a538ce90fa6725368c1ee89a056561b4\": container with ID starting with 27f50fc80f8c0ebbd19c7d8041355292a538ce90fa6725368c1ee89a056561b4 not found: ID does not exist" Sep 30 13:48:48 crc kubenswrapper[4763]: I0930 13:48:48.047318 4763 scope.go:117] "RemoveContainer" containerID="28d187ae2b9168b36ec462d03fd078f5286cb6830c1a16beb7746237ac99f51c" Sep 30 13:48:48 crc kubenswrapper[4763]: E0930 13:48:48.048005 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d187ae2b9168b36ec462d03fd078f5286cb6830c1a16beb7746237ac99f51c\": container with ID starting with 28d187ae2b9168b36ec462d03fd078f5286cb6830c1a16beb7746237ac99f51c not found: ID does not exist" containerID="28d187ae2b9168b36ec462d03fd078f5286cb6830c1a16beb7746237ac99f51c" Sep 30 13:48:48 crc kubenswrapper[4763]: I0930 13:48:48.048045 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d187ae2b9168b36ec462d03fd078f5286cb6830c1a16beb7746237ac99f51c"} err="failed to get container status \"28d187ae2b9168b36ec462d03fd078f5286cb6830c1a16beb7746237ac99f51c\": rpc error: code = NotFound desc = could not find container \"28d187ae2b9168b36ec462d03fd078f5286cb6830c1a16beb7746237ac99f51c\": container with ID starting with 28d187ae2b9168b36ec462d03fd078f5286cb6830c1a16beb7746237ac99f51c not found: ID does not exist" Sep 30 13:48:48 crc kubenswrapper[4763]: I0930 13:48:48.510193 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" path="/var/lib/kubelet/pods/e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f/volumes" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.026259 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" event={"ID":"7a15a621-d993-4cee-a58d-dbf5b4361ede","Type":"ContainerStarted","Data":"a16a1c3e5eb776f19c03ad558e836bea666175654e793e0489a296bf7ad7d1e9"} Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.026748 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.028728 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" event={"ID":"06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6","Type":"ContainerStarted","Data":"af449f7e7c3f280f118d279a078756520d023f55d61f67fa8a8599bc2ccd8561"} Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.028847 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.050497 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" podStartSLOduration=2.127632409 podStartE2EDuration="8.050474813s" podCreationTimestamp="2025-09-30 13:48:45 +0000 UTC" firstStartedPulling="2025-09-30 13:48:46.775461458 +0000 UTC m=+798.914021743" lastFinishedPulling="2025-09-30 13:48:52.698303852 +0000 UTC m=+804.836864147" observedRunningTime="2025-09-30 13:48:53.044645104 +0000 UTC m=+805.183205389" watchObservedRunningTime="2025-09-30 13:48:53.050474813 +0000 UTC m=+805.189035098" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.070909 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" podStartSLOduration=1.807966647 podStartE2EDuration="8.070891173s" podCreationTimestamp="2025-09-30 13:48:45 +0000 UTC" firstStartedPulling="2025-09-30 13:48:46.451793245 +0000 UTC m=+798.590353530" lastFinishedPulling="2025-09-30 13:48:52.714717771 +0000 UTC m=+804.853278056" observedRunningTime="2025-09-30 13:48:53.066783599 +0000 UTC m=+805.205343884" watchObservedRunningTime="2025-09-30 13:48:53.070891173 +0000 UTC m=+805.209451458" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.538387 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9ghql"] Sep 30 13:48:53 crc kubenswrapper[4763]: E0930 13:48:53.538635 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" containerName="extract-utilities" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.538651 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" containerName="extract-utilities" Sep 30 13:48:53 crc kubenswrapper[4763]: E0930 13:48:53.538669 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" containerName="extract-content" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.538675 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" containerName="extract-content" Sep 30 13:48:53 crc kubenswrapper[4763]: E0930 13:48:53.538686 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" containerName="registry-server" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.538693 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" containerName="registry-server" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.538812 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e76dc5ff-8fd4-4f58-9ab6-0ebfb7a5917f" containerName="registry-server" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.539610 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.554002 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9ghql"] Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.587678 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c4nz\" (UniqueName: \"kubernetes.io/projected/74794558-1ff9-480f-b978-25be82be45ef-kube-api-access-8c4nz\") pod \"community-operators-9ghql\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.588071 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-utilities\") pod \"community-operators-9ghql\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.588225 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-catalog-content\") pod \"community-operators-9ghql\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.689493 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c4nz\" (UniqueName: \"kubernetes.io/projected/74794558-1ff9-480f-b978-25be82be45ef-kube-api-access-8c4nz\") pod \"community-operators-9ghql\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.689572 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-utilities\") pod \"community-operators-9ghql\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.689592 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-catalog-content\") pod \"community-operators-9ghql\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.690036 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-catalog-content\") pod \"community-operators-9ghql\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.690232 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-utilities\") pod \"community-operators-9ghql\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.709072 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8c4nz\" (UniqueName: \"kubernetes.io/projected/74794558-1ff9-480f-b978-25be82be45ef-kube-api-access-8c4nz\") pod \"community-operators-9ghql\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:48:53 crc kubenswrapper[4763]: I0930 13:48:53.852482 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:48:54 crc kubenswrapper[4763]: I0930 13:48:54.311556 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9ghql"] Sep 30 13:48:54 crc kubenswrapper[4763]: W0930 13:48:54.316047 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74794558_1ff9_480f_b978_25be82be45ef.slice/crio-bd9764cab7fa389847ba09e92772d16ae7a3d9f1576f796cc400e9b27cdb6b2a WatchSource:0}: Error finding container bd9764cab7fa389847ba09e92772d16ae7a3d9f1576f796cc400e9b27cdb6b2a: Status 404 returned error can't find the container with id bd9764cab7fa389847ba09e92772d16ae7a3d9f1576f796cc400e9b27cdb6b2a Sep 30 13:48:55 crc kubenswrapper[4763]: I0930 13:48:55.045210 4763 generic.go:334] "Generic (PLEG): container finished" podID="74794558-1ff9-480f-b978-25be82be45ef" containerID="75ef6735d4240859340efd43fbd0404333dbb1d198fcb7f02f4d8c90e725c8db" exitCode=0 Sep 30 13:48:55 crc kubenswrapper[4763]: I0930 13:48:55.045263 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ghql" event={"ID":"74794558-1ff9-480f-b978-25be82be45ef","Type":"ContainerDied","Data":"75ef6735d4240859340efd43fbd0404333dbb1d198fcb7f02f4d8c90e725c8db"} Sep 30 13:48:55 crc kubenswrapper[4763]: I0930 13:48:55.045287 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ghql" event={"ID":"74794558-1ff9-480f-b978-25be82be45ef","Type":"ContainerStarted","Data":"bd9764cab7fa389847ba09e92772d16ae7a3d9f1576f796cc400e9b27cdb6b2a"} Sep 30 13:48:56 crc kubenswrapper[4763]: I0930 13:48:56.053895 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ghql" event={"ID":"74794558-1ff9-480f-b978-25be82be45ef","Type":"ContainerStarted","Data":"bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab"} Sep 30 13:48:57 crc kubenswrapper[4763]: I0930 13:48:57.059629 4763 generic.go:334] "Generic (PLEG): container finished" podID="74794558-1ff9-480f-b978-25be82be45ef" containerID="bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab" exitCode=0 Sep 30 13:48:57 crc kubenswrapper[4763]: I0930 13:48:57.059673 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ghql" event={"ID":"74794558-1ff9-480f-b978-25be82be45ef","Type":"ContainerDied","Data":"bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab"} Sep 30 13:48:58 crc kubenswrapper[4763]: I0930 13:48:58.067695 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ghql" event={"ID":"74794558-1ff9-480f-b978-25be82be45ef","Type":"ContainerStarted","Data":"91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4"} Sep 30 13:48:58 crc kubenswrapper[4763]: I0930 13:48:58.087721 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9ghql" 
podStartSLOduration=2.630848651 podStartE2EDuration="5.087703152s" podCreationTimestamp="2025-09-30 13:48:53 +0000 UTC" firstStartedPulling="2025-09-30 13:48:55.04651169 +0000 UTC m=+807.185071975" lastFinishedPulling="2025-09-30 13:48:57.503366191 +0000 UTC m=+809.641926476" observedRunningTime="2025-09-30 13:48:58.084583702 +0000 UTC m=+810.223143987" watchObservedRunningTime="2025-09-30 13:48:58.087703152 +0000 UTC m=+810.226263447" Sep 30 13:49:03 crc kubenswrapper[4763]: I0930 13:49:03.853246 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:49:03 crc kubenswrapper[4763]: I0930 13:49:03.853942 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:49:03 crc kubenswrapper[4763]: I0930 13:49:03.903626 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:49:04 crc kubenswrapper[4763]: I0930 13:49:04.137774 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:49:04 crc kubenswrapper[4763]: I0930 13:49:04.732029 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9ghql"] Sep 30 13:49:06 crc kubenswrapper[4763]: I0930 13:49:06.111504 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9ghql" podUID="74794558-1ff9-480f-b978-25be82be45ef" containerName="registry-server" containerID="cri-o://91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4" gracePeriod=2 Sep 30 13:49:06 crc kubenswrapper[4763]: I0930 13:49:06.283891 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-94474c8f7-zxvpq" Sep 30 13:49:06 crc kubenswrapper[4763]: I0930 13:49:06.999268 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.060737 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-catalog-content\") pod \"74794558-1ff9-480f-b978-25be82be45ef\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.060843 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-utilities\") pod \"74794558-1ff9-480f-b978-25be82be45ef\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.060891 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c4nz\" (UniqueName: \"kubernetes.io/projected/74794558-1ff9-480f-b978-25be82be45ef-kube-api-access-8c4nz\") pod \"74794558-1ff9-480f-b978-25be82be45ef\" (UID: \"74794558-1ff9-480f-b978-25be82be45ef\") " Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.062864 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-utilities" (OuterVolumeSpecName: "utilities") pod "74794558-1ff9-480f-b978-25be82be45ef" (UID: "74794558-1ff9-480f-b978-25be82be45ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.066914 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74794558-1ff9-480f-b978-25be82be45ef-kube-api-access-8c4nz" (OuterVolumeSpecName: "kube-api-access-8c4nz") pod "74794558-1ff9-480f-b978-25be82be45ef" (UID: "74794558-1ff9-480f-b978-25be82be45ef"). InnerVolumeSpecName "kube-api-access-8c4nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.117812 4763 generic.go:334] "Generic (PLEG): container finished" podID="74794558-1ff9-480f-b978-25be82be45ef" containerID="91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4" exitCode=0 Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.117853 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ghql" event={"ID":"74794558-1ff9-480f-b978-25be82be45ef","Type":"ContainerDied","Data":"91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4"} Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.117915 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ghql" event={"ID":"74794558-1ff9-480f-b978-25be82be45ef","Type":"ContainerDied","Data":"bd9764cab7fa389847ba09e92772d16ae7a3d9f1576f796cc400e9b27cdb6b2a"} Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.117934 4763 scope.go:117] "RemoveContainer" containerID="91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.118787 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9ghql" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.133470 4763 scope.go:117] "RemoveContainer" containerID="bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.148303 4763 scope.go:117] "RemoveContainer" containerID="75ef6735d4240859340efd43fbd0404333dbb1d198fcb7f02f4d8c90e725c8db" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.162620 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.162643 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c4nz\" (UniqueName: \"kubernetes.io/projected/74794558-1ff9-480f-b978-25be82be45ef-kube-api-access-8c4nz\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.162773 4763 scope.go:117] "RemoveContainer" containerID="91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4" Sep 30 13:49:07 crc kubenswrapper[4763]: E0930 13:49:07.163687 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4\": container with ID starting with 91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4 not found: ID does not exist" containerID="91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.163720 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4"} err="failed to get container status \"91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4\": rpc error: code = NotFound desc = could not find container \"91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4\": container with ID starting with 91c57277c6b220bbd4dc84fc14b37836338b30745f2c3fed36f7c9e5d711d0b4 not found: ID does not exist" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.163745 4763 scope.go:117] "RemoveContainer" containerID="bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab" Sep 30 13:49:07 crc kubenswrapper[4763]: E0930 13:49:07.164185 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab\": container with ID starting with bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab not found: ID does not exist" containerID="bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.164216 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab"} err="failed to get container status \"bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab\": rpc error: code = NotFound desc = could not find container \"bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab\": container with ID starting with bd23830c9436c976360adafd98c5aafe0574f8ee1e05f3ba1fd00c2e3b81b5ab not found: ID does not exist" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.164237 4763 scope.go:117] "RemoveContainer" 
containerID="75ef6735d4240859340efd43fbd0404333dbb1d198fcb7f02f4d8c90e725c8db" Sep 30 13:49:07 crc kubenswrapper[4763]: E0930 13:49:07.164573 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ef6735d4240859340efd43fbd0404333dbb1d198fcb7f02f4d8c90e725c8db\": container with ID starting with 75ef6735d4240859340efd43fbd0404333dbb1d198fcb7f02f4d8c90e725c8db not found: ID does not exist" containerID="75ef6735d4240859340efd43fbd0404333dbb1d198fcb7f02f4d8c90e725c8db" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.164728 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ef6735d4240859340efd43fbd0404333dbb1d198fcb7f02f4d8c90e725c8db"} err="failed to get container status \"75ef6735d4240859340efd43fbd0404333dbb1d198fcb7f02f4d8c90e725c8db\": rpc error: code = NotFound desc = could not find container \"75ef6735d4240859340efd43fbd0404333dbb1d198fcb7f02f4d8c90e725c8db\": container with ID starting with 75ef6735d4240859340efd43fbd0404333dbb1d198fcb7f02f4d8c90e725c8db not found: ID does not exist" Sep 30 13:49:07 crc kubenswrapper[4763]: I0930 13:49:07.985910 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74794558-1ff9-480f-b978-25be82be45ef" (UID: "74794558-1ff9-480f-b978-25be82be45ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:49:08 crc kubenswrapper[4763]: I0930 13:49:08.050576 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9ghql"] Sep 30 13:49:08 crc kubenswrapper[4763]: I0930 13:49:08.057716 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9ghql"] Sep 30 13:49:08 crc kubenswrapper[4763]: I0930 13:49:08.073633 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74794558-1ff9-480f-b978-25be82be45ef-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:08 crc kubenswrapper[4763]: I0930 13:49:08.497427 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74794558-1ff9-480f-b978-25be82be45ef" path="/var/lib/kubelet/pods/74794558-1ff9-480f-b978-25be82be45ef/volumes" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.630738 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hgpxs"] Sep 30 13:49:16 crc kubenswrapper[4763]: E0930 13:49:16.631374 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74794558-1ff9-480f-b978-25be82be45ef" containerName="registry-server" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.631386 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="74794558-1ff9-480f-b978-25be82be45ef" containerName="registry-server" Sep 30 13:49:16 crc kubenswrapper[4763]: E0930 13:49:16.631407 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74794558-1ff9-480f-b978-25be82be45ef" containerName="extract-content" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.631413 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="74794558-1ff9-480f-b978-25be82be45ef" containerName="extract-content" Sep 30 13:49:16 crc kubenswrapper[4763]: E0930 13:49:16.631425 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="74794558-1ff9-480f-b978-25be82be45ef" containerName="extract-utilities" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.631432 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="74794558-1ff9-480f-b978-25be82be45ef" containerName="extract-utilities" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.631520 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="74794558-1ff9-480f-b978-25be82be45ef" containerName="registry-server" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.632253 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.649702 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hgpxs"] Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.779755 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-catalog-content\") pod \"certified-operators-hgpxs\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.780128 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2zln\" (UniqueName: \"kubernetes.io/projected/79f725d9-bee0-4e1f-a2b5-856ea0546825-kube-api-access-z2zln\") pod \"certified-operators-hgpxs\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.780159 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-utilities\") pod \"certified-operators-hgpxs\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.881204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-catalog-content\") pod \"certified-operators-hgpxs\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.881267 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2zln\" (UniqueName: \"kubernetes.io/projected/79f725d9-bee0-4e1f-a2b5-856ea0546825-kube-api-access-z2zln\") pod \"certified-operators-hgpxs\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.881288 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-utilities\") pod \"certified-operators-hgpxs\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.881747 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-catalog-content\") pod 
\"certified-operators-hgpxs\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.881797 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-utilities\") pod \"certified-operators-hgpxs\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.900510 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2zln\" (UniqueName: \"kubernetes.io/projected/79f725d9-bee0-4e1f-a2b5-856ea0546825-kube-api-access-z2zln\") pod \"certified-operators-hgpxs\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:16 crc kubenswrapper[4763]: I0930 13:49:16.952832 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:17 crc kubenswrapper[4763]: I0930 13:49:17.377586 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hgpxs"] Sep 30 13:49:18 crc kubenswrapper[4763]: I0930 13:49:18.175747 4763 generic.go:334] "Generic (PLEG): container finished" podID="79f725d9-bee0-4e1f-a2b5-856ea0546825" containerID="297a97c0aee6b6d6f8da2173ca53417c7d15030749e3452d8864eac569883895" exitCode=0 Sep 30 13:49:18 crc kubenswrapper[4763]: I0930 13:49:18.175801 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgpxs" event={"ID":"79f725d9-bee0-4e1f-a2b5-856ea0546825","Type":"ContainerDied","Data":"297a97c0aee6b6d6f8da2173ca53417c7d15030749e3452d8864eac569883895"} Sep 30 13:49:18 crc kubenswrapper[4763]: I0930 13:49:18.175832 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgpxs" event={"ID":"79f725d9-bee0-4e1f-a2b5-856ea0546825","Type":"ContainerStarted","Data":"4104c62ef89d920e0225b586bff64eeb113ccf253380d313503fa6c52de23981"} Sep 30 13:49:19 crc kubenswrapper[4763]: I0930 13:49:19.181469 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgpxs" event={"ID":"79f725d9-bee0-4e1f-a2b5-856ea0546825","Type":"ContainerStarted","Data":"1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651"} Sep 30 13:49:20 crc kubenswrapper[4763]: I0930 13:49:20.189662 4763 generic.go:334] "Generic (PLEG): container finished" podID="79f725d9-bee0-4e1f-a2b5-856ea0546825" containerID="1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651" exitCode=0 Sep 30 13:49:20 crc kubenswrapper[4763]: I0930 13:49:20.189709 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgpxs" event={"ID":"79f725d9-bee0-4e1f-a2b5-856ea0546825","Type":"ContainerDied","Data":"1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651"} Sep 30 13:49:21 crc kubenswrapper[4763]: I0930 13:49:21.198389 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgpxs" event={"ID":"79f725d9-bee0-4e1f-a2b5-856ea0546825","Type":"ContainerStarted","Data":"0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8"} Sep 30 13:49:25 crc kubenswrapper[4763]: I0930 13:49:25.977497 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-68dc9cffd9-8prvz" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.008383 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hgpxs" podStartSLOduration=7.362903685 podStartE2EDuration="10.008347384s" podCreationTimestamp="2025-09-30 13:49:16 +0000 UTC" firstStartedPulling="2025-09-30 13:49:18.177406284 +0000 UTC m=+830.315966579" lastFinishedPulling="2025-09-30 13:49:20.822849973 +0000 UTC m=+832.961410278" observedRunningTime="2025-09-30 13:49:21.218336348 +0000 UTC m=+833.356896633" watchObservedRunningTime="2025-09-30 13:49:26.008347384 +0000 UTC m=+838.146907659" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.688391 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-8jngk"] Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.691097 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.695119 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.695617 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.701406 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-5prdm" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.706976 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-reloader\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.707058 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-metrics\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.707103 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r4v7\" (UniqueName: \"kubernetes.io/projected/d761c91a-ac32-4853-9045-0a8fb9df18c6-kube-api-access-4r4v7\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.707137 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d761c91a-ac32-4853-9045-0a8fb9df18c6-metrics-certs\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.707165 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-frr-conf\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.707387 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d761c91a-ac32-4853-9045-0a8fb9df18c6-frr-startup\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.707454 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-frr-sockets\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.727393 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m"] Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.728288 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.730477 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.772553 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m"] Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.808218 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d761c91a-ac32-4853-9045-0a8fb9df18c6-frr-startup\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.808278 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-frr-sockets\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.808321 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9626782e-2d58-46e5-b064-7cc2fcb72381-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bd98m\" (UID: \"9626782e-2d58-46e5-b064-7cc2fcb72381\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.808351 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-reloader\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.808376 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-metrics\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.808396 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r4v7\" (UniqueName: \"kubernetes.io/projected/d761c91a-ac32-4853-9045-0a8fb9df18c6-kube-api-access-4r4v7\") pod \"frr-k8s-8jngk\" (UID: 
\"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.808429 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d761c91a-ac32-4853-9045-0a8fb9df18c6-metrics-certs\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.808457 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-frr-conf\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.808494 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2zl2\" (UniqueName: \"kubernetes.io/projected/9626782e-2d58-46e5-b064-7cc2fcb72381-kube-api-access-g2zl2\") pod \"frr-k8s-webhook-server-5478bdb765-bd98m\" (UID: \"9626782e-2d58-46e5-b064-7cc2fcb72381\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.808856 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-frr-sockets\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.808963 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-reloader\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.809245 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-frr-conf\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.809343 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d761c91a-ac32-4853-9045-0a8fb9df18c6-frr-startup\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.809360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d761c91a-ac32-4853-9045-0a8fb9df18c6-metrics\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.818084 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-c6gdx"] Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.819124 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-c6gdx" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.826109 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.826307 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-28hvd" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.826421 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.826452 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.827419 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d761c91a-ac32-4853-9045-0a8fb9df18c6-metrics-certs\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.872358 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r4v7\" (UniqueName: \"kubernetes.io/projected/d761c91a-ac32-4853-9045-0a8fb9df18c6-kube-api-access-4r4v7\") pod \"frr-k8s-8jngk\" (UID: \"d761c91a-ac32-4853-9045-0a8fb9df18c6\") " pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.909585 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2zl2\" (UniqueName: \"kubernetes.io/projected/9626782e-2d58-46e5-b064-7cc2fcb72381-kube-api-access-g2zl2\") pod \"frr-k8s-webhook-server-5478bdb765-bd98m\" (UID: \"9626782e-2d58-46e5-b064-7cc2fcb72381\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.909720 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9626782e-2d58-46e5-b064-7cc2fcb72381-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bd98m\" (UID: \"9626782e-2d58-46e5-b064-7cc2fcb72381\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.913913 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9626782e-2d58-46e5-b064-7cc2fcb72381-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bd98m\" (UID: \"9626782e-2d58-46e5-b064-7cc2fcb72381\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.922937 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-nzrdj"] Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.928253 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.941845 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.945178 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2zl2\" (UniqueName: \"kubernetes.io/projected/9626782e-2d58-46e5-b064-7cc2fcb72381-kube-api-access-g2zl2\") pod \"frr-k8s-webhook-server-5478bdb765-bd98m\" (UID: \"9626782e-2d58-46e5-b064-7cc2fcb72381\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.950185 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-nzrdj"] Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.961145 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:26 crc kubenswrapper[4763]: I0930 13:49:26.961189 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.010517 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-metrics-certs\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.010590 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f7308df6-1fa3-4459-a108-5151e3b927fd-metallb-excludel2\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.010683 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-memberlist\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.010709 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhglr\" (UniqueName: \"kubernetes.io/projected/f7308df6-1fa3-4459-a108-5151e3b927fd-kube-api-access-mhglr\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.010863 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.011736 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.043457 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.113721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zngfb\" (UniqueName: \"kubernetes.io/projected/8f6f9f19-81cf-4593-8a84-7f1d771d4aa1-kube-api-access-zngfb\") pod \"controller-5d688f5ffc-nzrdj\" (UID: \"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1\") " pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.113795 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f6f9f19-81cf-4593-8a84-7f1d771d4aa1-cert\") pod \"controller-5d688f5ffc-nzrdj\" (UID: \"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1\") " pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.113836 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-metrics-certs\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.113865 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f7308df6-1fa3-4459-a108-5151e3b927fd-metallb-excludel2\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.113908 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-memberlist\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.113924 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhglr\" (UniqueName: \"kubernetes.io/projected/f7308df6-1fa3-4459-a108-5151e3b927fd-kube-api-access-mhglr\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.113950 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f6f9f19-81cf-4593-8a84-7f1d771d4aa1-metrics-certs\") pod \"controller-5d688f5ffc-nzrdj\" (UID: \"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1\") " pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:27 crc kubenswrapper[4763]: E0930 13:49:27.114101 4763 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Sep 30 13:49:27 crc kubenswrapper[4763]: E0930 13:49:27.114151 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-metrics-certs podName:f7308df6-1fa3-4459-a108-5151e3b927fd nodeName:}" failed. No retries permitted until 2025-09-30 13:49:27.614132192 +0000 UTC m=+839.752692477 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-metrics-certs") pod "speaker-c6gdx" (UID: "f7308df6-1fa3-4459-a108-5151e3b927fd") : secret "speaker-certs-secret" not found Sep 30 13:49:27 crc kubenswrapper[4763]: E0930 13:49:27.114520 4763 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 13:49:27 crc kubenswrapper[4763]: E0930 13:49:27.114705 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-memberlist podName:f7308df6-1fa3-4459-a108-5151e3b927fd nodeName:}" failed. No retries permitted until 2025-09-30 13:49:27.614678606 +0000 UTC m=+839.753238891 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-memberlist") pod "speaker-c6gdx" (UID: "f7308df6-1fa3-4459-a108-5151e3b927fd") : secret "metallb-memberlist" not found Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.115721 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f7308df6-1fa3-4459-a108-5151e3b927fd-metallb-excludel2\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.139821 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhglr\" (UniqueName: \"kubernetes.io/projected/f7308df6-1fa3-4459-a108-5151e3b927fd-kube-api-access-mhglr\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.214732 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zngfb\" (UniqueName: \"kubernetes.io/projected/8f6f9f19-81cf-4593-8a84-7f1d771d4aa1-kube-api-access-zngfb\") pod \"controller-5d688f5ffc-nzrdj\" (UID: \"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1\") " pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.214796 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f6f9f19-81cf-4593-8a84-7f1d771d4aa1-cert\") pod \"controller-5d688f5ffc-nzrdj\" (UID: \"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1\") " pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.214927 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f6f9f19-81cf-4593-8a84-7f1d771d4aa1-metrics-certs\") pod \"controller-5d688f5ffc-nzrdj\" (UID: \"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1\") " pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.219874 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f6f9f19-81cf-4593-8a84-7f1d771d4aa1-cert\") pod \"controller-5d688f5ffc-nzrdj\" (UID: \"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1\") " pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.228463 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f6f9f19-81cf-4593-8a84-7f1d771d4aa1-metrics-certs\") pod 
\"controller-5d688f5ffc-nzrdj\" (UID: \"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1\") " pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.235790 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zngfb\" (UniqueName: \"kubernetes.io/projected/8f6f9f19-81cf-4593-8a84-7f1d771d4aa1-kube-api-access-zngfb\") pod \"controller-5d688f5ffc-nzrdj\" (UID: \"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1\") " pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.247651 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jngk" event={"ID":"d761c91a-ac32-4853-9045-0a8fb9df18c6","Type":"ContainerStarted","Data":"d962d587e88c8e9e5768fa4576a371faa41890dd469eda1971101f092d504000"} Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.281377 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.307005 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m"] Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.310944 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.373946 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hgpxs"] Sep 30 13:49:27 crc kubenswrapper[4763]: W0930 13:49:27.548469 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6f9f19_81cf_4593_8a84_7f1d771d4aa1.slice/crio-9b09c6622a12b5c13769e413965e4ee2f7f4c5cc03aa3c2dd5bc9be65964f5dc WatchSource:0}: Error finding container 9b09c6622a12b5c13769e413965e4ee2f7f4c5cc03aa3c2dd5bc9be65964f5dc: Status 404 returned error can't find the container with id 9b09c6622a12b5c13769e413965e4ee2f7f4c5cc03aa3c2dd5bc9be65964f5dc Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.551989 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-nzrdj"] Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.619696 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-metrics-certs\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.619792 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-memberlist\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:27 crc kubenswrapper[4763]: E0930 13:49:27.619943 4763 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 13:49:27 crc kubenswrapper[4763]: E0930 13:49:27.620042 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-memberlist podName:f7308df6-1fa3-4459-a108-5151e3b927fd nodeName:}" failed. No retries permitted until 2025-09-30 13:49:28.620018822 +0000 UTC m=+840.758579107 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-memberlist") pod "speaker-c6gdx" (UID: "f7308df6-1fa3-4459-a108-5151e3b927fd") : secret "metallb-memberlist" not found Sep 30 13:49:27 crc kubenswrapper[4763]: I0930 13:49:27.626784 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-metrics-certs\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:28 crc kubenswrapper[4763]: I0930 13:49:28.254089 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" event={"ID":"9626782e-2d58-46e5-b064-7cc2fcb72381","Type":"ContainerStarted","Data":"342fd0806378ed41a76444dbfbb8ba04cd9881ac425ab68393d38080d6cc08ad"} Sep 30 13:49:28 crc kubenswrapper[4763]: I0930 13:49:28.255564 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-nzrdj" event={"ID":"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1","Type":"ContainerStarted","Data":"7d6b7950228ec8e094a177032280b17188f54e7fb88ad3acf818bff614960712"} Sep 30 13:49:28 crc kubenswrapper[4763]: I0930 13:49:28.255660 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-nzrdj" event={"ID":"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1","Type":"ContainerStarted","Data":"9b83c44ef78ae558723ec14657ed99193cc1bd8f9d4890e9267cef43b0ee73c9"} Sep 30 13:49:28 crc kubenswrapper[4763]: I0930 13:49:28.255679 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-nzrdj" event={"ID":"8f6f9f19-81cf-4593-8a84-7f1d771d4aa1","Type":"ContainerStarted","Data":"9b09c6622a12b5c13769e413965e4ee2f7f4c5cc03aa3c2dd5bc9be65964f5dc"} Sep 30 13:49:28 crc kubenswrapper[4763]: I0930 13:49:28.509052 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-nzrdj" podStartSLOduration=2.509030942 podStartE2EDuration="2.509030942s" podCreationTimestamp="2025-09-30 13:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:49:28.272707496 +0000 UTC m=+840.411267781" watchObservedRunningTime="2025-09-30 13:49:28.509030942 +0000 UTC m=+840.647591227" Sep 30 13:49:28 crc kubenswrapper[4763]: I0930 13:49:28.631863 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-memberlist\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:28 crc kubenswrapper[4763]: I0930 13:49:28.636342 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f7308df6-1fa3-4459-a108-5151e3b927fd-memberlist\") pod \"speaker-c6gdx\" (UID: \"f7308df6-1fa3-4459-a108-5151e3b927fd\") " pod="metallb-system/speaker-c6gdx" Sep 30 13:49:28 crc kubenswrapper[4763]: I0930 13:49:28.708651 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-28hvd" Sep 30 13:49:28 crc kubenswrapper[4763]: I0930 13:49:28.716871 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-c6gdx" Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.268771 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-c6gdx" event={"ID":"f7308df6-1fa3-4459-a108-5151e3b927fd","Type":"ContainerStarted","Data":"ff8efde4c0571ef640568bbb1989fcdec3eb444a16e3b886b243f42261e97325"} Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.269112 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-c6gdx" event={"ID":"f7308df6-1fa3-4459-a108-5151e3b927fd","Type":"ContainerStarted","Data":"bbdb2287c3e1b659b9d362e6b8ba17d2b5e99b87febbcab0452e3e60fb1dc303"} Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.269135 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.269167 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hgpxs" podUID="79f725d9-bee0-4e1f-a2b5-856ea0546825" containerName="registry-server" containerID="cri-o://0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8" gracePeriod=2 Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.777610 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.847760 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-utilities\") pod \"79f725d9-bee0-4e1f-a2b5-856ea0546825\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.847917 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2zln\" (UniqueName: \"kubernetes.io/projected/79f725d9-bee0-4e1f-a2b5-856ea0546825-kube-api-access-z2zln\") pod \"79f725d9-bee0-4e1f-a2b5-856ea0546825\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.848121 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-catalog-content\") pod \"79f725d9-bee0-4e1f-a2b5-856ea0546825\" (UID: \"79f725d9-bee0-4e1f-a2b5-856ea0546825\") " Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.849401 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-utilities" (OuterVolumeSpecName: "utilities") pod "79f725d9-bee0-4e1f-a2b5-856ea0546825" (UID: "79f725d9-bee0-4e1f-a2b5-856ea0546825"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.866537 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f725d9-bee0-4e1f-a2b5-856ea0546825-kube-api-access-z2zln" (OuterVolumeSpecName: "kube-api-access-z2zln") pod "79f725d9-bee0-4e1f-a2b5-856ea0546825" (UID: "79f725d9-bee0-4e1f-a2b5-856ea0546825"). InnerVolumeSpecName "kube-api-access-z2zln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.949625 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2zln\" (UniqueName: \"kubernetes.io/projected/79f725d9-bee0-4e1f-a2b5-856ea0546825-kube-api-access-z2zln\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:29 crc kubenswrapper[4763]: I0930 13:49:29.949673 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.279017 4763 generic.go:334] "Generic (PLEG): container finished" podID="79f725d9-bee0-4e1f-a2b5-856ea0546825" containerID="0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8" exitCode=0 Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.279078 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgpxs" event={"ID":"79f725d9-bee0-4e1f-a2b5-856ea0546825","Type":"ContainerDied","Data":"0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8"} Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.279441 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgpxs" event={"ID":"79f725d9-bee0-4e1f-a2b5-856ea0546825","Type":"ContainerDied","Data":"4104c62ef89d920e0225b586bff64eeb113ccf253380d313503fa6c52de23981"} Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.279463 4763 scope.go:117] "RemoveContainer" containerID="0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.279132 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hgpxs" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.282699 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-c6gdx" event={"ID":"f7308df6-1fa3-4459-a108-5151e3b927fd","Type":"ContainerStarted","Data":"826749431d2e1739fab54cbacfee5811716db9cc15d8004a6cc9e5df56702c82"} Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.325578 4763 scope.go:117] "RemoveContainer" containerID="1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.326158 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-c6gdx" podStartSLOduration=4.326136329 podStartE2EDuration="4.326136329s" podCreationTimestamp="2025-09-30 13:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:49:30.321677444 +0000 UTC m=+842.460237749" watchObservedRunningTime="2025-09-30 13:49:30.326136329 +0000 UTC m=+842.464696624" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.350976 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79f725d9-bee0-4e1f-a2b5-856ea0546825" (UID: "79f725d9-bee0-4e1f-a2b5-856ea0546825"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.361371 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f725d9-bee0-4e1f-a2b5-856ea0546825-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.380771 4763 scope.go:117] "RemoveContainer" containerID="297a97c0aee6b6d6f8da2173ca53417c7d15030749e3452d8864eac569883895" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.424917 4763 scope.go:117] "RemoveContainer" containerID="0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8" Sep 30 13:49:30 crc kubenswrapper[4763]: E0930 13:49:30.427373 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8\": container with ID starting with 0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8 not found: ID does not exist" containerID="0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.427431 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8"} err="failed to get container status \"0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8\": rpc error: code = NotFound desc = could not find container \"0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8\": container with ID starting with 0f0c0c5dfbe0b8243e00fc0f1e0e8a385b7a466a21d80f6da0325df1015d9ab8 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.427464 4763 scope.go:117] "RemoveContainer" containerID="1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651" Sep 30 13:49:30 crc kubenswrapper[4763]: E0930 13:49:30.428039 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651\": container with ID starting with 1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651 not found: ID does not exist" containerID="1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.428083 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651"} err="failed to get container status \"1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651\": rpc error: code = NotFound desc = could not find container \"1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651\": container with ID starting with 1002e5ca8c009749db9e862b38731c79accc1063d6f5ab0801a7fb229cad9651 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.428114 4763 scope.go:117] "RemoveContainer" containerID="297a97c0aee6b6d6f8da2173ca53417c7d15030749e3452d8864eac569883895" Sep 30 13:49:30 crc kubenswrapper[4763]: E0930 13:49:30.428500 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"297a97c0aee6b6d6f8da2173ca53417c7d15030749e3452d8864eac569883895\": container with ID starting with 297a97c0aee6b6d6f8da2173ca53417c7d15030749e3452d8864eac569883895 not found: ID does not exist" 
containerID="297a97c0aee6b6d6f8da2173ca53417c7d15030749e3452d8864eac569883895" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.428560 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297a97c0aee6b6d6f8da2173ca53417c7d15030749e3452d8864eac569883895"} err="failed to get container status \"297a97c0aee6b6d6f8da2173ca53417c7d15030749e3452d8864eac569883895\": rpc error: code = NotFound desc = could not find container \"297a97c0aee6b6d6f8da2173ca53417c7d15030749e3452d8864eac569883895\": container with ID starting with 297a97c0aee6b6d6f8da2173ca53417c7d15030749e3452d8864eac569883895 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.598895 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hgpxs"] Sep 30 13:49:30 crc kubenswrapper[4763]: I0930 13:49:30.601949 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hgpxs"] Sep 30 13:49:31 crc kubenswrapper[4763]: I0930 13:49:31.289390 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-c6gdx" Sep 30 13:49:32 crc kubenswrapper[4763]: I0930 13:49:32.501770 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f725d9-bee0-4e1f-a2b5-856ea0546825" path="/var/lib/kubelet/pods/79f725d9-bee0-4e1f-a2b5-856ea0546825/volumes" Sep 30 13:49:35 crc kubenswrapper[4763]: I0930 13:49:35.312626 4763 generic.go:334] "Generic (PLEG): container finished" podID="d761c91a-ac32-4853-9045-0a8fb9df18c6" containerID="41c457847b50524258bc41a2708d97b5b07ac90204e52bd31b4997f13d6bdc69" exitCode=0 Sep 30 13:49:35 crc kubenswrapper[4763]: I0930 13:49:35.313791 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jngk" event={"ID":"d761c91a-ac32-4853-9045-0a8fb9df18c6","Type":"ContainerDied","Data":"41c457847b50524258bc41a2708d97b5b07ac90204e52bd31b4997f13d6bdc69"} Sep 30 13:49:35 crc kubenswrapper[4763]: I0930 13:49:35.318026 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" event={"ID":"9626782e-2d58-46e5-b064-7cc2fcb72381","Type":"ContainerStarted","Data":"32418d9c723095bbfedc307b3d0989c39674e59c19598d004d4fd2c20068db88"} Sep 30 13:49:35 crc kubenswrapper[4763]: I0930 13:49:35.318438 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" Sep 30 13:49:35 crc kubenswrapper[4763]: I0930 13:49:35.362388 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" podStartSLOduration=1.697661659 podStartE2EDuration="9.362368412s" podCreationTimestamp="2025-09-30 13:49:26 +0000 UTC" firstStartedPulling="2025-09-30 13:49:27.327473982 +0000 UTC m=+839.466034267" lastFinishedPulling="2025-09-30 13:49:34.992180735 +0000 UTC m=+847.130741020" observedRunningTime="2025-09-30 13:49:35.35873337 +0000 UTC m=+847.497293655" watchObservedRunningTime="2025-09-30 13:49:35.362368412 +0000 UTC m=+847.500928697" Sep 30 13:49:36 crc kubenswrapper[4763]: I0930 13:49:36.326434 4763 generic.go:334] "Generic (PLEG): container finished" podID="d761c91a-ac32-4853-9045-0a8fb9df18c6" containerID="917a0122eb396c673e26d31b3575e78ba8f53e0ffae96bdc2fb2e728c9cfde82" exitCode=0 Sep 30 13:49:36 crc kubenswrapper[4763]: I0930 13:49:36.326533 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-8jngk" event={"ID":"d761c91a-ac32-4853-9045-0a8fb9df18c6","Type":"ContainerDied","Data":"917a0122eb396c673e26d31b3575e78ba8f53e0ffae96bdc2fb2e728c9cfde82"} Sep 30 13:49:37 crc kubenswrapper[4763]: I0930 13:49:37.286159 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-nzrdj" Sep 30 13:49:37 crc kubenswrapper[4763]: I0930 13:49:37.340786 4763 generic.go:334] "Generic (PLEG): container finished" podID="d761c91a-ac32-4853-9045-0a8fb9df18c6" containerID="abc83e0b1527aa8fb47d7e040161469bea2a0d2fa4134d7c438a19012dac83d9" exitCode=0 Sep 30 13:49:37 crc kubenswrapper[4763]: I0930 13:49:37.340855 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jngk" event={"ID":"d761c91a-ac32-4853-9045-0a8fb9df18c6","Type":"ContainerDied","Data":"abc83e0b1527aa8fb47d7e040161469bea2a0d2fa4134d7c438a19012dac83d9"} Sep 30 13:49:38 crc kubenswrapper[4763]: I0930 13:49:38.352347 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jngk" event={"ID":"d761c91a-ac32-4853-9045-0a8fb9df18c6","Type":"ContainerStarted","Data":"e471664228afd5763f71b7aa9eb1664eaffaaed5d5f281d7b15668ab2a1e7fa1"} Sep 30 13:49:38 crc kubenswrapper[4763]: I0930 13:49:38.352396 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jngk" event={"ID":"d761c91a-ac32-4853-9045-0a8fb9df18c6","Type":"ContainerStarted","Data":"4601d2d66ac248c15cb5726beb191b109adb132e99e652876cec87e67df1a132"} Sep 30 13:49:38 crc kubenswrapper[4763]: I0930 13:49:38.352408 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jngk" event={"ID":"d761c91a-ac32-4853-9045-0a8fb9df18c6","Type":"ContainerStarted","Data":"d5948793e6aa11346d53df7d310760ad7b8e3dc5fb33ae13e9106a74e46a5b6c"} Sep 30 13:49:38 crc kubenswrapper[4763]: I0930 13:49:38.352420 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jngk" event={"ID":"d761c91a-ac32-4853-9045-0a8fb9df18c6","Type":"ContainerStarted","Data":"2a52a6d2e42a9bb4f65fb0125f226b65f5df1ff7d3f6e7bbdabc5a862f69462d"} Sep 30 13:49:38 crc kubenswrapper[4763]: I0930 13:49:38.352430 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jngk" event={"ID":"d761c91a-ac32-4853-9045-0a8fb9df18c6","Type":"ContainerStarted","Data":"9675db1751077480e5e33922ffb7b5a8dbf7d061354e8d116b2613e281c93d3e"} Sep 30 13:49:38 crc kubenswrapper[4763]: I0930 13:49:38.723847 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-c6gdx" Sep 30 13:49:39 crc kubenswrapper[4763]: I0930 13:49:39.361886 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8jngk" event={"ID":"d761c91a-ac32-4853-9045-0a8fb9df18c6","Type":"ContainerStarted","Data":"c188f4075faf9643ef0925585bb37fe05e7640651d6d453e89a98bdb6f1f84fc"} Sep 30 13:49:39 crc kubenswrapper[4763]: I0930 13:49:39.363192 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:39 crc kubenswrapper[4763]: I0930 13:49:39.387292 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-8jngk" podStartSLOduration=5.569404291 podStartE2EDuration="13.387275815s" podCreationTimestamp="2025-09-30 13:49:26 +0000 UTC" firstStartedPulling="2025-09-30 13:49:27.198312199 +0000 UTC m=+839.336872484" lastFinishedPulling="2025-09-30 13:49:35.016183723 +0000 UTC m=+847.154744008" 
observedRunningTime="2025-09-30 13:49:39.383181901 +0000 UTC m=+851.521742206" watchObservedRunningTime="2025-09-30 13:49:39.387275815 +0000 UTC m=+851.525836100" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.053241 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h"] Sep 30 13:49:40 crc kubenswrapper[4763]: E0930 13:49:40.053862 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f725d9-bee0-4e1f-a2b5-856ea0546825" containerName="extract-content" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.053890 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f725d9-bee0-4e1f-a2b5-856ea0546825" containerName="extract-content" Sep 30 13:49:40 crc kubenswrapper[4763]: E0930 13:49:40.053923 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f725d9-bee0-4e1f-a2b5-856ea0546825" containerName="extract-utilities" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.053932 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f725d9-bee0-4e1f-a2b5-856ea0546825" containerName="extract-utilities" Sep 30 13:49:40 crc kubenswrapper[4763]: E0930 13:49:40.053948 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f725d9-bee0-4e1f-a2b5-856ea0546825" containerName="registry-server" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.053957 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f725d9-bee0-4e1f-a2b5-856ea0546825" containerName="registry-server" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.054090 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f725d9-bee0-4e1f-a2b5-856ea0546825" containerName="registry-server" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.055181 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.057584 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.064652 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h"] Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.190664 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbwv9\" (UniqueName: \"kubernetes.io/projected/42bb402e-356c-4746-ad86-991264de21e7-kube-api-access-vbwv9\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.190771 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.191190 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.292041 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.292123 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbwv9\" (UniqueName: \"kubernetes.io/projected/42bb402e-356c-4746-ad86-991264de21e7-kube-api-access-vbwv9\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.292200 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.293490 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.293770 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.314495 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbwv9\" (UniqueName: \"kubernetes.io/projected/42bb402e-356c-4746-ad86-991264de21e7-kube-api-access-vbwv9\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.370662 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:40 crc kubenswrapper[4763]: I0930 13:49:40.652924 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h"] Sep 30 13:49:40 crc kubenswrapper[4763]: W0930 13:49:40.657415 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42bb402e_356c_4746_ad86_991264de21e7.slice/crio-6a5ad5077b5ef652ad9ff2b8703de6954600f0a3cc261da15e89a0126bfd4d23 WatchSource:0}: Error finding container 6a5ad5077b5ef652ad9ff2b8703de6954600f0a3cc261da15e89a0126bfd4d23: Status 404 returned error can't find the container with id 6a5ad5077b5ef652ad9ff2b8703de6954600f0a3cc261da15e89a0126bfd4d23 Sep 30 13:49:41 crc kubenswrapper[4763]: I0930 13:49:41.373314 4763 generic.go:334] "Generic (PLEG): container finished" podID="42bb402e-356c-4746-ad86-991264de21e7" containerID="db44df505e5ad26d3d7c847ae8ee927268083d529039721603177b261e25f0ef" exitCode=0 Sep 30 13:49:41 crc kubenswrapper[4763]: I0930 13:49:41.374487 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" event={"ID":"42bb402e-356c-4746-ad86-991264de21e7","Type":"ContainerDied","Data":"db44df505e5ad26d3d7c847ae8ee927268083d529039721603177b261e25f0ef"} Sep 30 13:49:41 crc kubenswrapper[4763]: I0930 13:49:41.374523 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" event={"ID":"42bb402e-356c-4746-ad86-991264de21e7","Type":"ContainerStarted","Data":"6a5ad5077b5ef652ad9ff2b8703de6954600f0a3cc261da15e89a0126bfd4d23"} Sep 30 13:49:42 crc kubenswrapper[4763]: I0930 13:49:42.012966 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:42 crc kubenswrapper[4763]: I0930 13:49:42.056067 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:45 crc kubenswrapper[4763]: 
I0930 13:49:45.401450 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" event={"ID":"42bb402e-356c-4746-ad86-991264de21e7","Type":"ContainerStarted","Data":"d81aa4398a2944c33c8c972b316d8992ba86b3195de4eb002e9279f345a1f83b"} Sep 30 13:49:46 crc kubenswrapper[4763]: I0930 13:49:46.408723 4763 generic.go:334] "Generic (PLEG): container finished" podID="42bb402e-356c-4746-ad86-991264de21e7" containerID="d81aa4398a2944c33c8c972b316d8992ba86b3195de4eb002e9279f345a1f83b" exitCode=0 Sep 30 13:49:46 crc kubenswrapper[4763]: I0930 13:49:46.408767 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" event={"ID":"42bb402e-356c-4746-ad86-991264de21e7","Type":"ContainerDied","Data":"d81aa4398a2944c33c8c972b316d8992ba86b3195de4eb002e9279f345a1f83b"} Sep 30 13:49:47 crc kubenswrapper[4763]: I0930 13:49:47.015589 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-8jngk" Sep 30 13:49:47 crc kubenswrapper[4763]: I0930 13:49:47.049505 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bd98m" Sep 30 13:49:47 crc kubenswrapper[4763]: I0930 13:49:47.418905 4763 generic.go:334] "Generic (PLEG): container finished" podID="42bb402e-356c-4746-ad86-991264de21e7" containerID="647410790eea547f9cde1ad076ab32f3819210896259fda9b404b7d2beb57e08" exitCode=0 Sep 30 13:49:47 crc kubenswrapper[4763]: I0930 13:49:47.418972 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" event={"ID":"42bb402e-356c-4746-ad86-991264de21e7","Type":"ContainerDied","Data":"647410790eea547f9cde1ad076ab32f3819210896259fda9b404b7d2beb57e08"} Sep 30 13:49:48 crc kubenswrapper[4763]: I0930 13:49:48.825157 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:48 crc kubenswrapper[4763]: I0930 13:49:48.905465 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-util\") pod \"42bb402e-356c-4746-ad86-991264de21e7\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " Sep 30 13:49:48 crc kubenswrapper[4763]: I0930 13:49:48.905512 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-bundle\") pod \"42bb402e-356c-4746-ad86-991264de21e7\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " Sep 30 13:49:48 crc kubenswrapper[4763]: I0930 13:49:48.905556 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbwv9\" (UniqueName: \"kubernetes.io/projected/42bb402e-356c-4746-ad86-991264de21e7-kube-api-access-vbwv9\") pod \"42bb402e-356c-4746-ad86-991264de21e7\" (UID: \"42bb402e-356c-4746-ad86-991264de21e7\") " Sep 30 13:49:48 crc kubenswrapper[4763]: I0930 13:49:48.908145 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-bundle" (OuterVolumeSpecName: "bundle") pod "42bb402e-356c-4746-ad86-991264de21e7" (UID: "42bb402e-356c-4746-ad86-991264de21e7"). 
InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:49:48 crc kubenswrapper[4763]: I0930 13:49:48.911391 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42bb402e-356c-4746-ad86-991264de21e7-kube-api-access-vbwv9" (OuterVolumeSpecName: "kube-api-access-vbwv9") pod "42bb402e-356c-4746-ad86-991264de21e7" (UID: "42bb402e-356c-4746-ad86-991264de21e7"). InnerVolumeSpecName "kube-api-access-vbwv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:49:48 crc kubenswrapper[4763]: I0930 13:49:48.916273 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-util" (OuterVolumeSpecName: "util") pod "42bb402e-356c-4746-ad86-991264de21e7" (UID: "42bb402e-356c-4746-ad86-991264de21e7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:49:49 crc kubenswrapper[4763]: I0930 13:49:49.007143 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-util\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:49 crc kubenswrapper[4763]: I0930 13:49:49.007187 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42bb402e-356c-4746-ad86-991264de21e7-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:49 crc kubenswrapper[4763]: I0930 13:49:49.007196 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbwv9\" (UniqueName: \"kubernetes.io/projected/42bb402e-356c-4746-ad86-991264de21e7-kube-api-access-vbwv9\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:49 crc kubenswrapper[4763]: I0930 13:49:49.431319 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" event={"ID":"42bb402e-356c-4746-ad86-991264de21e7","Type":"ContainerDied","Data":"6a5ad5077b5ef652ad9ff2b8703de6954600f0a3cc261da15e89a0126bfd4d23"} Sep 30 13:49:49 crc kubenswrapper[4763]: I0930 13:49:49.431356 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a5ad5077b5ef652ad9ff2b8703de6954600f0a3cc261da15e89a0126bfd4d23" Sep 30 13:49:49 crc kubenswrapper[4763]: I0930 13:49:49.431400 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.342461 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w2rx"] Sep 30 13:49:53 crc kubenswrapper[4763]: E0930 13:49:53.343181 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42bb402e-356c-4746-ad86-991264de21e7" containerName="pull" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.343193 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42bb402e-356c-4746-ad86-991264de21e7" containerName="pull" Sep 30 13:49:53 crc kubenswrapper[4763]: E0930 13:49:53.343207 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42bb402e-356c-4746-ad86-991264de21e7" containerName="util" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.343214 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42bb402e-356c-4746-ad86-991264de21e7" containerName="util" Sep 30 13:49:53 crc kubenswrapper[4763]: E0930 13:49:53.343234 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42bb402e-356c-4746-ad86-991264de21e7" containerName="extract" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.343242 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42bb402e-356c-4746-ad86-991264de21e7" containerName="extract" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.343361 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="42bb402e-356c-4746-ad86-991264de21e7" containerName="extract" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.343782 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w2rx" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.345949 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.345994 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.346065 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-hltbv" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.356310 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w2rx"] Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.461755 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdqhh\" (UniqueName: \"kubernetes.io/projected/e6ccebb5-42ef-4015-8966-20739f3f22ed-kube-api-access-pdqhh\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2w2rx\" (UID: \"e6ccebb5-42ef-4015-8966-20739f3f22ed\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w2rx" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.563161 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdqhh\" (UniqueName: \"kubernetes.io/projected/e6ccebb5-42ef-4015-8966-20739f3f22ed-kube-api-access-pdqhh\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2w2rx\" (UID: \"e6ccebb5-42ef-4015-8966-20739f3f22ed\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w2rx" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.596866 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdqhh\" (UniqueName: \"kubernetes.io/projected/e6ccebb5-42ef-4015-8966-20739f3f22ed-kube-api-access-pdqhh\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2w2rx\" (UID: \"e6ccebb5-42ef-4015-8966-20739f3f22ed\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w2rx" Sep 30 13:49:53 crc kubenswrapper[4763]: I0930 13:49:53.658274 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w2rx" Sep 30 13:49:54 crc kubenswrapper[4763]: I0930 13:49:54.068681 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w2rx"] Sep 30 13:49:54 crc kubenswrapper[4763]: W0930 13:49:54.077934 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6ccebb5_42ef_4015_8966_20739f3f22ed.slice/crio-7d525cca52f7b8690c9f5cb2c0d4a5d30b8d98b03ff78faf9db8a17a2e51b99c WatchSource:0}: Error finding container 7d525cca52f7b8690c9f5cb2c0d4a5d30b8d98b03ff78faf9db8a17a2e51b99c: Status 404 returned error can't find the container with id 7d525cca52f7b8690c9f5cb2c0d4a5d30b8d98b03ff78faf9db8a17a2e51b99c Sep 30 13:49:54 crc kubenswrapper[4763]: I0930 13:49:54.458621 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w2rx" event={"ID":"e6ccebb5-42ef-4015-8966-20739f3f22ed","Type":"ContainerStarted","Data":"7d525cca52f7b8690c9f5cb2c0d4a5d30b8d98b03ff78faf9db8a17a2e51b99c"} Sep 30 13:50:02 crc kubenswrapper[4763]: I0930 13:50:02.522001 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w2rx" event={"ID":"e6ccebb5-42ef-4015-8966-20739f3f22ed","Type":"ContainerStarted","Data":"59cb96d3a4a7e9ab36ffb530b0b40729e2b601adb0fabe7e5bf9e83a6efed2b7"} Sep 30 13:50:02 crc kubenswrapper[4763]: I0930 13:50:02.544829 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2w2rx" podStartSLOduration=2.198614697 podStartE2EDuration="9.544814552s" podCreationTimestamp="2025-09-30 13:49:53 +0000 UTC" firstStartedPulling="2025-09-30 13:49:54.081237566 +0000 UTC m=+866.219797851" lastFinishedPulling="2025-09-30 13:50:01.427437421 +0000 UTC m=+873.565997706" observedRunningTime="2025-09-30 13:50:02.541433706 +0000 UTC m=+874.679994011" watchObservedRunningTime="2025-09-30 13:50:02.544814552 +0000 UTC m=+874.683374837" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.173460 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-prxdp"] Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.174362 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-prxdp" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.179751 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.180260 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-nrnbg" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.180619 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.186825 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-prxdp"] Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.229574 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5fwc\" (UniqueName: \"kubernetes.io/projected/d74685c2-f75e-4c1e-84e7-63c7bc19f221-kube-api-access-z5fwc\") pod \"cert-manager-webhook-d969966f-prxdp\" (UID: \"d74685c2-f75e-4c1e-84e7-63c7bc19f221\") " pod="cert-manager/cert-manager-webhook-d969966f-prxdp" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.229637 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d74685c2-f75e-4c1e-84e7-63c7bc19f221-bound-sa-token\") pod \"cert-manager-webhook-d969966f-prxdp\" (UID: \"d74685c2-f75e-4c1e-84e7-63c7bc19f221\") " pod="cert-manager/cert-manager-webhook-d969966f-prxdp" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.330619 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5fwc\" (UniqueName: \"kubernetes.io/projected/d74685c2-f75e-4c1e-84e7-63c7bc19f221-kube-api-access-z5fwc\") pod \"cert-manager-webhook-d969966f-prxdp\" (UID: \"d74685c2-f75e-4c1e-84e7-63c7bc19f221\") " pod="cert-manager/cert-manager-webhook-d969966f-prxdp" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.330685 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d74685c2-f75e-4c1e-84e7-63c7bc19f221-bound-sa-token\") pod \"cert-manager-webhook-d969966f-prxdp\" (UID: \"d74685c2-f75e-4c1e-84e7-63c7bc19f221\") " pod="cert-manager/cert-manager-webhook-d969966f-prxdp" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.350133 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d74685c2-f75e-4c1e-84e7-63c7bc19f221-bound-sa-token\") pod \"cert-manager-webhook-d969966f-prxdp\" (UID: \"d74685c2-f75e-4c1e-84e7-63c7bc19f221\") " pod="cert-manager/cert-manager-webhook-d969966f-prxdp" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.352306 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5fwc\" (UniqueName: \"kubernetes.io/projected/d74685c2-f75e-4c1e-84e7-63c7bc19f221-kube-api-access-z5fwc\") pod \"cert-manager-webhook-d969966f-prxdp\" (UID: \"d74685c2-f75e-4c1e-84e7-63c7bc19f221\") " pod="cert-manager/cert-manager-webhook-d969966f-prxdp" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.491092 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-prxdp" Sep 30 13:50:05 crc kubenswrapper[4763]: I0930 13:50:05.925909 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-prxdp"] Sep 30 13:50:05 crc kubenswrapper[4763]: W0930 13:50:05.932331 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd74685c2_f75e_4c1e_84e7_63c7bc19f221.slice/crio-ea970c20a21b2b082ffbe45e9d04cf55490d9aa3a2209662c1c696d490485ec4 WatchSource:0}: Error finding container ea970c20a21b2b082ffbe45e9d04cf55490d9aa3a2209662c1c696d490485ec4: Status 404 returned error can't find the container with id ea970c20a21b2b082ffbe45e9d04cf55490d9aa3a2209662c1c696d490485ec4 Sep 30 13:50:06 crc kubenswrapper[4763]: I0930 13:50:06.060273 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:50:06 crc kubenswrapper[4763]: I0930 13:50:06.060370 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:50:06 crc kubenswrapper[4763]: I0930 13:50:06.543502 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-prxdp" event={"ID":"d74685c2-f75e-4c1e-84e7-63c7bc19f221","Type":"ContainerStarted","Data":"ea970c20a21b2b082ffbe45e9d04cf55490d9aa3a2209662c1c696d490485ec4"} Sep 30 13:50:08 crc kubenswrapper[4763]: I0930 13:50:08.543712 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz"] Sep 30 13:50:08 crc kubenswrapper[4763]: I0930 13:50:08.544445 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz" Sep 30 13:50:08 crc kubenswrapper[4763]: I0930 13:50:08.548971 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-gsn2b" Sep 30 13:50:08 crc kubenswrapper[4763]: I0930 13:50:08.559771 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz"] Sep 30 13:50:08 crc kubenswrapper[4763]: I0930 13:50:08.577015 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53d7cd58-3d29-4172-8ecf-f7117a00e79f-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-gt2hz\" (UID: \"53d7cd58-3d29-4172-8ecf-f7117a00e79f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz" Sep 30 13:50:08 crc kubenswrapper[4763]: I0930 13:50:08.577100 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rhht\" (UniqueName: \"kubernetes.io/projected/53d7cd58-3d29-4172-8ecf-f7117a00e79f-kube-api-access-8rhht\") pod \"cert-manager-cainjector-7d9f95dbf-gt2hz\" (UID: \"53d7cd58-3d29-4172-8ecf-f7117a00e79f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz" Sep 30 13:50:08 crc kubenswrapper[4763]: I0930 13:50:08.678877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53d7cd58-3d29-4172-8ecf-f7117a00e79f-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-gt2hz\" (UID: \"53d7cd58-3d29-4172-8ecf-f7117a00e79f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz" Sep 30 13:50:08 crc kubenswrapper[4763]: I0930 13:50:08.678963 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rhht\" (UniqueName: \"kubernetes.io/projected/53d7cd58-3d29-4172-8ecf-f7117a00e79f-kube-api-access-8rhht\") pod \"cert-manager-cainjector-7d9f95dbf-gt2hz\" (UID: \"53d7cd58-3d29-4172-8ecf-f7117a00e79f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz" Sep 30 13:50:08 crc kubenswrapper[4763]: I0930 13:50:08.696979 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53d7cd58-3d29-4172-8ecf-f7117a00e79f-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-gt2hz\" (UID: \"53d7cd58-3d29-4172-8ecf-f7117a00e79f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz" Sep 30 13:50:08 crc kubenswrapper[4763]: I0930 13:50:08.705321 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rhht\" (UniqueName: \"kubernetes.io/projected/53d7cd58-3d29-4172-8ecf-f7117a00e79f-kube-api-access-8rhht\") pod \"cert-manager-cainjector-7d9f95dbf-gt2hz\" (UID: \"53d7cd58-3d29-4172-8ecf-f7117a00e79f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz" Sep 30 13:50:08 crc kubenswrapper[4763]: I0930 13:50:08.859031 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz" Sep 30 13:50:10 crc kubenswrapper[4763]: I0930 13:50:10.675196 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz"] Sep 30 13:50:10 crc kubenswrapper[4763]: W0930 13:50:10.680812 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d7cd58_3d29_4172_8ecf_f7117a00e79f.slice/crio-41c70e32d65440e03b4e660f93c2e2cb561b9613ee59d46237956c86730b37c2 WatchSource:0}: Error finding container 41c70e32d65440e03b4e660f93c2e2cb561b9613ee59d46237956c86730b37c2: Status 404 returned error can't find the container with id 41c70e32d65440e03b4e660f93c2e2cb561b9613ee59d46237956c86730b37c2 Sep 30 13:50:11 crc kubenswrapper[4763]: I0930 13:50:11.571401 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz" event={"ID":"53d7cd58-3d29-4172-8ecf-f7117a00e79f","Type":"ContainerStarted","Data":"754313b4c4d943a3fe4f7198786758f27587739a48ff7f159cae52451555c078"} Sep 30 13:50:11 crc kubenswrapper[4763]: I0930 13:50:11.571461 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz" event={"ID":"53d7cd58-3d29-4172-8ecf-f7117a00e79f","Type":"ContainerStarted","Data":"41c70e32d65440e03b4e660f93c2e2cb561b9613ee59d46237956c86730b37c2"} Sep 30 13:50:11 crc kubenswrapper[4763]: I0930 13:50:11.573337 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-prxdp" event={"ID":"d74685c2-f75e-4c1e-84e7-63c7bc19f221","Type":"ContainerStarted","Data":"90a1d7888042637db72538363f020159b352ed3f857671dd9bfeed1f8abc3896"} Sep 30 13:50:11 crc kubenswrapper[4763]: I0930 13:50:11.573468 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-prxdp" Sep 30 13:50:11 crc kubenswrapper[4763]: I0930 13:50:11.590786 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gt2hz" podStartSLOduration=3.590760059 podStartE2EDuration="3.590760059s" podCreationTimestamp="2025-09-30 13:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:50:11.587077906 +0000 UTC m=+883.725638191" watchObservedRunningTime="2025-09-30 13:50:11.590760059 +0000 UTC m=+883.729320334" Sep 30 13:50:11 crc kubenswrapper[4763]: I0930 13:50:11.605987 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-prxdp" podStartSLOduration=1.8201967300000002 podStartE2EDuration="6.605964164s" podCreationTimestamp="2025-09-30 13:50:05 +0000 UTC" firstStartedPulling="2025-09-30 13:50:05.934161628 +0000 UTC m=+878.072721913" lastFinishedPulling="2025-09-30 13:50:10.719929062 +0000 UTC m=+882.858489347" observedRunningTime="2025-09-30 13:50:11.605188535 +0000 UTC m=+883.743748820" watchObservedRunningTime="2025-09-30 13:50:11.605964164 +0000 UTC m=+883.744524459" Sep 30 13:50:15 crc kubenswrapper[4763]: I0930 13:50:15.498073 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-prxdp" Sep 30 13:50:24 crc kubenswrapper[4763]: I0930 13:50:24.719339 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-9kkpz"] Sep 
30 13:50:24 crc kubenswrapper[4763]: I0930 13:50:24.720449 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-9kkpz" Sep 30 13:50:24 crc kubenswrapper[4763]: I0930 13:50:24.726130 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-dgd5l" Sep 30 13:50:24 crc kubenswrapper[4763]: I0930 13:50:24.734452 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-9kkpz"] Sep 30 13:50:24 crc kubenswrapper[4763]: I0930 13:50:24.889691 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88dbbf99-11f5-4a82-95d8-a0c0e97d7d76-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-9kkpz\" (UID: \"88dbbf99-11f5-4a82-95d8-a0c0e97d7d76\") " pod="cert-manager/cert-manager-7d4cc89fcb-9kkpz" Sep 30 13:50:24 crc kubenswrapper[4763]: I0930 13:50:24.889748 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx5h6\" (UniqueName: \"kubernetes.io/projected/88dbbf99-11f5-4a82-95d8-a0c0e97d7d76-kube-api-access-gx5h6\") pod \"cert-manager-7d4cc89fcb-9kkpz\" (UID: \"88dbbf99-11f5-4a82-95d8-a0c0e97d7d76\") " pod="cert-manager/cert-manager-7d4cc89fcb-9kkpz" Sep 30 13:50:24 crc kubenswrapper[4763]: I0930 13:50:24.990402 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88dbbf99-11f5-4a82-95d8-a0c0e97d7d76-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-9kkpz\" (UID: \"88dbbf99-11f5-4a82-95d8-a0c0e97d7d76\") " pod="cert-manager/cert-manager-7d4cc89fcb-9kkpz" Sep 30 13:50:24 crc kubenswrapper[4763]: I0930 13:50:24.990469 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx5h6\" (UniqueName: \"kubernetes.io/projected/88dbbf99-11f5-4a82-95d8-a0c0e97d7d76-kube-api-access-gx5h6\") pod \"cert-manager-7d4cc89fcb-9kkpz\" (UID: \"88dbbf99-11f5-4a82-95d8-a0c0e97d7d76\") " pod="cert-manager/cert-manager-7d4cc89fcb-9kkpz" Sep 30 13:50:25 crc kubenswrapper[4763]: I0930 13:50:25.009493 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88dbbf99-11f5-4a82-95d8-a0c0e97d7d76-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-9kkpz\" (UID: \"88dbbf99-11f5-4a82-95d8-a0c0e97d7d76\") " pod="cert-manager/cert-manager-7d4cc89fcb-9kkpz" Sep 30 13:50:25 crc kubenswrapper[4763]: I0930 13:50:25.009927 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx5h6\" (UniqueName: \"kubernetes.io/projected/88dbbf99-11f5-4a82-95d8-a0c0e97d7d76-kube-api-access-gx5h6\") pod \"cert-manager-7d4cc89fcb-9kkpz\" (UID: \"88dbbf99-11f5-4a82-95d8-a0c0e97d7d76\") " pod="cert-manager/cert-manager-7d4cc89fcb-9kkpz" Sep 30 13:50:25 crc kubenswrapper[4763]: I0930 13:50:25.052118 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-9kkpz" Sep 30 13:50:25 crc kubenswrapper[4763]: I0930 13:50:25.255522 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-9kkpz"] Sep 30 13:50:25 crc kubenswrapper[4763]: W0930 13:50:25.260028 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88dbbf99_11f5_4a82_95d8_a0c0e97d7d76.slice/crio-84929dc1d62a168ec7a6be350475ec9223733bc71731dcd2da317d9fcb46b5ec WatchSource:0}: Error finding container 84929dc1d62a168ec7a6be350475ec9223733bc71731dcd2da317d9fcb46b5ec: Status 404 returned error can't find the container with id 84929dc1d62a168ec7a6be350475ec9223733bc71731dcd2da317d9fcb46b5ec Sep 30 13:50:25 crc kubenswrapper[4763]: I0930 13:50:25.653083 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-9kkpz" event={"ID":"88dbbf99-11f5-4a82-95d8-a0c0e97d7d76","Type":"ContainerStarted","Data":"0a682e783629eab57b2f0ad85add1b6163f68d71146575f27cdb8e67c96dc4fc"} Sep 30 13:50:25 crc kubenswrapper[4763]: I0930 13:50:25.653136 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-9kkpz" event={"ID":"88dbbf99-11f5-4a82-95d8-a0c0e97d7d76","Type":"ContainerStarted","Data":"84929dc1d62a168ec7a6be350475ec9223733bc71731dcd2da317d9fcb46b5ec"} Sep 30 13:50:25 crc kubenswrapper[4763]: I0930 13:50:25.670424 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-9kkpz" podStartSLOduration=1.6703994199999999 podStartE2EDuration="1.67039942s" podCreationTimestamp="2025-09-30 13:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:50:25.666743988 +0000 UTC m=+897.805304353" watchObservedRunningTime="2025-09-30 13:50:25.67039942 +0000 UTC m=+897.808959715" Sep 30 13:50:28 crc kubenswrapper[4763]: I0930 13:50:28.628842 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2fvwg"] Sep 30 13:50:28 crc kubenswrapper[4763]: I0930 13:50:28.629844 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2fvwg" Sep 30 13:50:28 crc kubenswrapper[4763]: I0930 13:50:28.632761 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 30 13:50:28 crc kubenswrapper[4763]: I0930 13:50:28.632788 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 30 13:50:28 crc kubenswrapper[4763]: I0930 13:50:28.633000 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-vtppn" Sep 30 13:50:28 crc kubenswrapper[4763]: I0930 13:50:28.639944 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwx59\" (UniqueName: \"kubernetes.io/projected/71ff0b7a-7f8e-4a64-8c27-36c18124346d-kube-api-access-zwx59\") pod \"openstack-operator-index-2fvwg\" (UID: \"71ff0b7a-7f8e-4a64-8c27-36c18124346d\") " pod="openstack-operators/openstack-operator-index-2fvwg" Sep 30 13:50:28 crc kubenswrapper[4763]: I0930 13:50:28.644568 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2fvwg"] Sep 30 13:50:28 crc kubenswrapper[4763]: I0930 13:50:28.740946 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwx59\" (UniqueName: \"kubernetes.io/projected/71ff0b7a-7f8e-4a64-8c27-36c18124346d-kube-api-access-zwx59\") pod \"openstack-operator-index-2fvwg\" (UID: \"71ff0b7a-7f8e-4a64-8c27-36c18124346d\") " pod="openstack-operators/openstack-operator-index-2fvwg" Sep 30 13:50:28 crc kubenswrapper[4763]: I0930 13:50:28.815938 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwx59\" (UniqueName: \"kubernetes.io/projected/71ff0b7a-7f8e-4a64-8c27-36c18124346d-kube-api-access-zwx59\") pod \"openstack-operator-index-2fvwg\" (UID: \"71ff0b7a-7f8e-4a64-8c27-36c18124346d\") " pod="openstack-operators/openstack-operator-index-2fvwg" Sep 30 13:50:29 crc kubenswrapper[4763]: I0930 13:50:29.110931 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2fvwg" Sep 30 13:50:29 crc kubenswrapper[4763]: I0930 13:50:29.492897 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2fvwg"] Sep 30 13:50:29 crc kubenswrapper[4763]: I0930 13:50:29.676873 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2fvwg" event={"ID":"71ff0b7a-7f8e-4a64-8c27-36c18124346d","Type":"ContainerStarted","Data":"d67b13c09f86c1c6c68f5a90f7e1db1c5382ea4dbe0d54a85486585fa15e5348"} Sep 30 13:50:32 crc kubenswrapper[4763]: I0930 13:50:32.014042 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2fvwg"] Sep 30 13:50:32 crc kubenswrapper[4763]: I0930 13:50:32.616461 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9bbxd"] Sep 30 13:50:32 crc kubenswrapper[4763]: I0930 13:50:32.617494 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9bbxd" Sep 30 13:50:32 crc kubenswrapper[4763]: I0930 13:50:32.629148 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9bbxd"] Sep 30 13:50:32 crc kubenswrapper[4763]: I0930 13:50:32.694420 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfvrv\" (UniqueName: \"kubernetes.io/projected/203eb617-841b-4e63-8258-10ddfde53da0-kube-api-access-nfvrv\") pod \"openstack-operator-index-9bbxd\" (UID: \"203eb617-841b-4e63-8258-10ddfde53da0\") " pod="openstack-operators/openstack-operator-index-9bbxd" Sep 30 13:50:32 crc kubenswrapper[4763]: I0930 13:50:32.795807 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfvrv\" (UniqueName: \"kubernetes.io/projected/203eb617-841b-4e63-8258-10ddfde53da0-kube-api-access-nfvrv\") pod \"openstack-operator-index-9bbxd\" (UID: \"203eb617-841b-4e63-8258-10ddfde53da0\") " pod="openstack-operators/openstack-operator-index-9bbxd" Sep 30 13:50:32 crc kubenswrapper[4763]: I0930 13:50:32.819949 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfvrv\" (UniqueName: \"kubernetes.io/projected/203eb617-841b-4e63-8258-10ddfde53da0-kube-api-access-nfvrv\") pod \"openstack-operator-index-9bbxd\" (UID: \"203eb617-841b-4e63-8258-10ddfde53da0\") " pod="openstack-operators/openstack-operator-index-9bbxd" Sep 30 13:50:32 crc kubenswrapper[4763]: I0930 13:50:32.941834 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9bbxd" Sep 30 13:50:35 crc kubenswrapper[4763]: I0930 13:50:35.524079 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9bbxd"] Sep 30 13:50:35 crc kubenswrapper[4763]: I0930 13:50:35.714395 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9bbxd" event={"ID":"203eb617-841b-4e63-8258-10ddfde53da0","Type":"ContainerStarted","Data":"a38259f2546b980fdafb0adb0eb5a50b1203c6e4f419324fc25e19fd224a9590"} Sep 30 13:50:35 crc kubenswrapper[4763]: I0930 13:50:35.716305 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2fvwg" event={"ID":"71ff0b7a-7f8e-4a64-8c27-36c18124346d","Type":"ContainerStarted","Data":"9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb"} Sep 30 13:50:35 crc kubenswrapper[4763]: I0930 13:50:35.716430 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-2fvwg" podUID="71ff0b7a-7f8e-4a64-8c27-36c18124346d" containerName="registry-server" containerID="cri-o://9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb" gracePeriod=2 Sep 30 13:50:35 crc kubenswrapper[4763]: I0930 13:50:35.731658 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2fvwg" podStartSLOduration=1.814686897 podStartE2EDuration="7.731644403s" podCreationTimestamp="2025-09-30 13:50:28 +0000 UTC" firstStartedPulling="2025-09-30 13:50:29.499457283 +0000 UTC m=+901.638017568" lastFinishedPulling="2025-09-30 13:50:35.416414789 +0000 UTC m=+907.554975074" observedRunningTime="2025-09-30 13:50:35.727978361 +0000 UTC m=+907.866538666" watchObservedRunningTime="2025-09-30 13:50:35.731644403 +0000 UTC m=+907.870204688" 
Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.049128 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2fvwg" Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.060197 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.060250 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.240941 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwx59\" (UniqueName: \"kubernetes.io/projected/71ff0b7a-7f8e-4a64-8c27-36c18124346d-kube-api-access-zwx59\") pod \"71ff0b7a-7f8e-4a64-8c27-36c18124346d\" (UID: \"71ff0b7a-7f8e-4a64-8c27-36c18124346d\") " Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.249363 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ff0b7a-7f8e-4a64-8c27-36c18124346d-kube-api-access-zwx59" (OuterVolumeSpecName: "kube-api-access-zwx59") pod "71ff0b7a-7f8e-4a64-8c27-36c18124346d" (UID: "71ff0b7a-7f8e-4a64-8c27-36c18124346d"). InnerVolumeSpecName "kube-api-access-zwx59". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.342710 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwx59\" (UniqueName: \"kubernetes.io/projected/71ff0b7a-7f8e-4a64-8c27-36c18124346d-kube-api-access-zwx59\") on node \"crc\" DevicePath \"\"" Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.725185 4763 generic.go:334] "Generic (PLEG): container finished" podID="71ff0b7a-7f8e-4a64-8c27-36c18124346d" containerID="9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb" exitCode=0 Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.725268 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2fvwg" event={"ID":"71ff0b7a-7f8e-4a64-8c27-36c18124346d","Type":"ContainerDied","Data":"9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb"} Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.725307 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2fvwg" event={"ID":"71ff0b7a-7f8e-4a64-8c27-36c18124346d","Type":"ContainerDied","Data":"d67b13c09f86c1c6c68f5a90f7e1db1c5382ea4dbe0d54a85486585fa15e5348"} Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.725329 4763 scope.go:117] "RemoveContainer" containerID="9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb" Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.725459 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2fvwg" Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.728142 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9bbxd" event={"ID":"203eb617-841b-4e63-8258-10ddfde53da0","Type":"ContainerStarted","Data":"b307a337f7246a38330cbeb16f6419955fd667918eb48efbbfba1613537d76d0"} Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.765021 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9bbxd" podStartSLOduration=4.3753968069999996 podStartE2EDuration="4.764993795s" podCreationTimestamp="2025-09-30 13:50:32 +0000 UTC" firstStartedPulling="2025-09-30 13:50:35.541035125 +0000 UTC m=+907.679595460" lastFinishedPulling="2025-09-30 13:50:35.930632123 +0000 UTC m=+908.069192448" observedRunningTime="2025-09-30 13:50:36.763427596 +0000 UTC m=+908.901987881" watchObservedRunningTime="2025-09-30 13:50:36.764993795 +0000 UTC m=+908.903554080" Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.765480 4763 scope.go:117] "RemoveContainer" containerID="9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb" Sep 30 13:50:36 crc kubenswrapper[4763]: E0930 13:50:36.766124 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb\": container with ID starting with 9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb not found: ID does not exist" containerID="9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb" Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.766195 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb"} err="failed to get container status \"9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb\": rpc error: code = NotFound desc = could not find container \"9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb\": container with ID starting with 9c9616d5a144ce3b0bd9adb459ef11ef8acebcef5144fc79f317d49f3f2443bb not found: ID does not exist" Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.783868 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2fvwg"] Sep 30 13:50:36 crc kubenswrapper[4763]: I0930 13:50:36.788030 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-2fvwg"] Sep 30 13:50:38 crc kubenswrapper[4763]: I0930 13:50:38.497469 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ff0b7a-7f8e-4a64-8c27-36c18124346d" path="/var/lib/kubelet/pods/71ff0b7a-7f8e-4a64-8c27-36c18124346d/volumes" Sep 30 13:50:42 crc kubenswrapper[4763]: I0930 13:50:42.941996 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-9bbxd" Sep 30 13:50:42 crc kubenswrapper[4763]: I0930 13:50:42.942361 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-9bbxd" Sep 30 13:50:42 crc kubenswrapper[4763]: I0930 13:50:42.981135 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-9bbxd" Sep 30 13:50:43 crc kubenswrapper[4763]: I0930 13:50:43.807679 4763 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-9bbxd" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.263491 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf"] Sep 30 13:51:04 crc kubenswrapper[4763]: E0930 13:51:04.264549 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ff0b7a-7f8e-4a64-8c27-36c18124346d" containerName="registry-server" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.264570 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ff0b7a-7f8e-4a64-8c27-36c18124346d" containerName="registry-server" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.264753 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ff0b7a-7f8e-4a64-8c27-36c18124346d" containerName="registry-server" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.265931 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.267951 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-zlxk6" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.273338 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf"] Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.409190 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-bundle\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.409262 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-util\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.409614 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpknv\" (UniqueName: \"kubernetes.io/projected/e99c5c15-af06-4688-8573-2faf4351d2d0-kube-api-access-hpknv\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.510512 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpknv\" (UniqueName: \"kubernetes.io/projected/e99c5c15-af06-4688-8573-2faf4351d2d0-kube-api-access-hpknv\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.510568 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-bundle\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.510619 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-util\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.511257 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-bundle\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.511275 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-util\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.534986 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpknv\" (UniqueName: \"kubernetes.io/projected/e99c5c15-af06-4688-8573-2faf4351d2d0-kube-api-access-hpknv\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.597663 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.811830 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf"] Sep 30 13:51:04 crc kubenswrapper[4763]: I0930 13:51:04.936813 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" event={"ID":"e99c5c15-af06-4688-8573-2faf4351d2d0","Type":"ContainerStarted","Data":"ac253e24f15e55c7beb8504d55a85f00e5c2ba179513a092734879cc834bc67f"} Sep 30 13:51:05 crc kubenswrapper[4763]: I0930 13:51:05.944652 4763 generic.go:334] "Generic (PLEG): container finished" podID="e99c5c15-af06-4688-8573-2faf4351d2d0" containerID="b1fbbd279fc59d7d2312f43742c01bb67273ca8c86748b9616139f604ff02f0d" exitCode=0 Sep 30 13:51:05 crc kubenswrapper[4763]: I0930 13:51:05.944713 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" event={"ID":"e99c5c15-af06-4688-8573-2faf4351d2d0","Type":"ContainerDied","Data":"b1fbbd279fc59d7d2312f43742c01bb67273ca8c86748b9616139f604ff02f0d"} Sep 30 13:51:06 crc kubenswrapper[4763]: I0930 13:51:06.060268 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:51:06 crc kubenswrapper[4763]: I0930 13:51:06.060346 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:51:06 crc kubenswrapper[4763]: I0930 13:51:06.060404 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:51:06 crc kubenswrapper[4763]: I0930 13:51:06.061214 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d22e10272d584e0d311db86eff7ac75db8f98341b6f7b1a40cf7584027c1ba8"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:51:06 crc kubenswrapper[4763]: I0930 13:51:06.061328 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://2d22e10272d584e0d311db86eff7ac75db8f98341b6f7b1a40cf7584027c1ba8" gracePeriod=600 Sep 30 13:51:06 crc kubenswrapper[4763]: I0930 13:51:06.953450 4763 generic.go:334] "Generic (PLEG): container finished" podID="e99c5c15-af06-4688-8573-2faf4351d2d0" containerID="d97f566448734e223bd2a63aded676b50ef2fbfb63930509dfcd6ac93a640c8f" exitCode=0 Sep 30 13:51:06 crc kubenswrapper[4763]: I0930 13:51:06.953529 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" 
event={"ID":"e99c5c15-af06-4688-8573-2faf4351d2d0","Type":"ContainerDied","Data":"d97f566448734e223bd2a63aded676b50ef2fbfb63930509dfcd6ac93a640c8f"} Sep 30 13:51:06 crc kubenswrapper[4763]: I0930 13:51:06.963150 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="2d22e10272d584e0d311db86eff7ac75db8f98341b6f7b1a40cf7584027c1ba8" exitCode=0 Sep 30 13:51:06 crc kubenswrapper[4763]: I0930 13:51:06.963185 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"2d22e10272d584e0d311db86eff7ac75db8f98341b6f7b1a40cf7584027c1ba8"} Sep 30 13:51:06 crc kubenswrapper[4763]: I0930 13:51:06.963228 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"929286b0798b4123a28e4fd7afc0d057a5a3facafe7726db3c5285288ca63279"} Sep 30 13:51:06 crc kubenswrapper[4763]: I0930 13:51:06.963256 4763 scope.go:117] "RemoveContainer" containerID="f66002987a3e708ee53022f61f57bc4019ea893682ccce020d3b5027a63c2bf8" Sep 30 13:51:07 crc kubenswrapper[4763]: I0930 13:51:07.973330 4763 generic.go:334] "Generic (PLEG): container finished" podID="e99c5c15-af06-4688-8573-2faf4351d2d0" containerID="90bf60f73d47d8e8b68b7d2c42d69cf2b00c2580f48dcf7c5a1b1799efaad234" exitCode=0 Sep 30 13:51:07 crc kubenswrapper[4763]: I0930 13:51:07.973401 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" event={"ID":"e99c5c15-af06-4688-8573-2faf4351d2d0","Type":"ContainerDied","Data":"90bf60f73d47d8e8b68b7d2c42d69cf2b00c2580f48dcf7c5a1b1799efaad234"} Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.229011 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.377404 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpknv\" (UniqueName: \"kubernetes.io/projected/e99c5c15-af06-4688-8573-2faf4351d2d0-kube-api-access-hpknv\") pod \"e99c5c15-af06-4688-8573-2faf4351d2d0\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.377635 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-util\") pod \"e99c5c15-af06-4688-8573-2faf4351d2d0\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.377714 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-bundle\") pod \"e99c5c15-af06-4688-8573-2faf4351d2d0\" (UID: \"e99c5c15-af06-4688-8573-2faf4351d2d0\") " Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.378974 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-bundle" (OuterVolumeSpecName: "bundle") pod "e99c5c15-af06-4688-8573-2faf4351d2d0" (UID: "e99c5c15-af06-4688-8573-2faf4351d2d0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.384132 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e99c5c15-af06-4688-8573-2faf4351d2d0-kube-api-access-hpknv" (OuterVolumeSpecName: "kube-api-access-hpknv") pod "e99c5c15-af06-4688-8573-2faf4351d2d0" (UID: "e99c5c15-af06-4688-8573-2faf4351d2d0"). InnerVolumeSpecName "kube-api-access-hpknv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.393034 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-util" (OuterVolumeSpecName: "util") pod "e99c5c15-af06-4688-8573-2faf4351d2d0" (UID: "e99c5c15-af06-4688-8573-2faf4351d2d0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.479145 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpknv\" (UniqueName: \"kubernetes.io/projected/e99c5c15-af06-4688-8573-2faf4351d2d0-kube-api-access-hpknv\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.479194 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-util\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.479207 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e99c5c15-af06-4688-8573-2faf4351d2d0-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.997341 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" event={"ID":"e99c5c15-af06-4688-8573-2faf4351d2d0","Type":"ContainerDied","Data":"ac253e24f15e55c7beb8504d55a85f00e5c2ba179513a092734879cc834bc67f"} Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.997415 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac253e24f15e55c7beb8504d55a85f00e5c2ba179513a092734879cc834bc67f" Sep 30 13:51:09 crc kubenswrapper[4763]: I0930 13:51:09.997508 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf" Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.403316 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll"] Sep 30 13:51:17 crc kubenswrapper[4763]: E0930 13:51:17.404079 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99c5c15-af06-4688-8573-2faf4351d2d0" containerName="extract" Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.404096 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99c5c15-af06-4688-8573-2faf4351d2d0" containerName="extract" Sep 30 13:51:17 crc kubenswrapper[4763]: E0930 13:51:17.404113 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99c5c15-af06-4688-8573-2faf4351d2d0" containerName="util" Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.404121 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99c5c15-af06-4688-8573-2faf4351d2d0" containerName="util" Sep 30 13:51:17 crc kubenswrapper[4763]: E0930 13:51:17.404135 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99c5c15-af06-4688-8573-2faf4351d2d0" containerName="pull" Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.404143 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99c5c15-af06-4688-8573-2faf4351d2d0" containerName="pull" Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.404297 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e99c5c15-af06-4688-8573-2faf4351d2d0" containerName="extract" Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.405098 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll" Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.407034 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-wh8xm" Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.438981 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll"] Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.597977 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdslm\" (UniqueName: \"kubernetes.io/projected/db30003e-6ca7-40a4-a3cf-9487f505109b-kube-api-access-bdslm\") pod \"openstack-operator-controller-operator-56dc567787-vbrll\" (UID: \"db30003e-6ca7-40a4-a3cf-9487f505109b\") " pod="openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll" Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.698933 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdslm\" (UniqueName: \"kubernetes.io/projected/db30003e-6ca7-40a4-a3cf-9487f505109b-kube-api-access-bdslm\") pod \"openstack-operator-controller-operator-56dc567787-vbrll\" (UID: \"db30003e-6ca7-40a4-a3cf-9487f505109b\") " pod="openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll" Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.722812 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdslm\" (UniqueName: \"kubernetes.io/projected/db30003e-6ca7-40a4-a3cf-9487f505109b-kube-api-access-bdslm\") pod \"openstack-operator-controller-operator-56dc567787-vbrll\" 
(UID: \"db30003e-6ca7-40a4-a3cf-9487f505109b\") " pod="openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll" Sep 30 13:51:17 crc kubenswrapper[4763]: I0930 13:51:17.724334 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll" Sep 30 13:51:18 crc kubenswrapper[4763]: I0930 13:51:18.165291 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll"] Sep 30 13:51:19 crc kubenswrapper[4763]: I0930 13:51:19.054526 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll" event={"ID":"db30003e-6ca7-40a4-a3cf-9487f505109b","Type":"ContainerStarted","Data":"b56023203026d1ad4ad540a54b3cc94d42af7655501dc69eb449ee9e81c2f7a7"} Sep 30 13:51:23 crc kubenswrapper[4763]: I0930 13:51:23.082647 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll" event={"ID":"db30003e-6ca7-40a4-a3cf-9487f505109b","Type":"ContainerStarted","Data":"c6a0a066a116991928d582f11d38312c0c16baf28ed2d3557605bcc915f10d4e"} Sep 30 13:51:25 crc kubenswrapper[4763]: I0930 13:51:25.098509 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll" event={"ID":"db30003e-6ca7-40a4-a3cf-9487f505109b","Type":"ContainerStarted","Data":"72e8c7cd323111178b3df9e569db6ce811064a2576f21194f7c05802259e07cb"} Sep 30 13:51:25 crc kubenswrapper[4763]: I0930 13:51:25.098851 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll" Sep 30 13:51:25 crc kubenswrapper[4763]: I0930 13:51:25.129419 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll" podStartSLOduration=1.3951892510000001 podStartE2EDuration="8.129402716s" podCreationTimestamp="2025-09-30 13:51:17 +0000 UTC" firstStartedPulling="2025-09-30 13:51:18.185363326 +0000 UTC m=+950.323923611" lastFinishedPulling="2025-09-30 13:51:24.919576781 +0000 UTC m=+957.058137076" observedRunningTime="2025-09-30 13:51:25.126128062 +0000 UTC m=+957.264688347" watchObservedRunningTime="2025-09-30 13:51:25.129402716 +0000 UTC m=+957.267963001" Sep 30 13:51:27 crc kubenswrapper[4763]: I0930 13:51:27.727048 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-vbrll" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.905459 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j"] Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.907089 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.910437 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r"] Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.911539 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-m7c5s" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.911857 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.917961 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4pwqq" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.923841 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv"] Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.925117 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.926977 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j"] Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.928249 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-bqs8c" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.930291 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r"] Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.946675 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv"] Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.970234 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn"] Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.971483 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.975495 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-ch5vd" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.982824 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6"] Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.984167 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.990446 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-9n88v" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.990641 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5"] Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.991567 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5" Sep 30 13:51:46 crc kubenswrapper[4763]: I0930 13:51:46.997649 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qn7ms" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.010348 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bhnh\" (UniqueName: \"kubernetes.io/projected/a4cba4a2-dc1b-485e-b141-a4d7f82176ac-kube-api-access-7bhnh\") pod \"designate-operator-controller-manager-77fb7bcf5b-lhnzv\" (UID: \"a4cba4a2-dc1b-485e-b141-a4d7f82176ac\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.010407 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkr6n\" (UniqueName: \"kubernetes.io/projected/3b13bc76-0bcb-48f3-9e18-f04720087325-kube-api-access-pkr6n\") pod \"heat-operator-controller-manager-5b4fc86755-hkxs6\" (UID: \"3b13bc76-0bcb-48f3-9e18-f04720087325\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.010482 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s99ll\" (UniqueName: \"kubernetes.io/projected/29c17248-6b6c-4ab7-8204-0f5d34a30da5-kube-api-access-s99ll\") pod \"barbican-operator-controller-manager-f7f98cb69-8s74j\" (UID: \"29c17248-6b6c-4ab7-8204-0f5d34a30da5\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.010532 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5cjn\" (UniqueName: \"kubernetes.io/projected/ea3d1e11-a06c-4cc4-af77-725fdafb57c8-kube-api-access-l5cjn\") pod \"horizon-operator-controller-manager-679b4759bb-8ftj5\" (UID: \"ea3d1e11-a06c-4cc4-af77-725fdafb57c8\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.010590 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsq5b\" (UniqueName: \"kubernetes.io/projected/f38cae68-c345-48f7-9be3-ea9467cb5485-kube-api-access-jsq5b\") pod \"cinder-operator-controller-manager-859cd486d-rpc2r\" (UID: \"f38cae68-c345-48f7-9be3-ea9467cb5485\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.010665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2mqj\" (UniqueName: 
\"kubernetes.io/projected/008d7fd6-b4bf-44bd-b06f-fc3a8787cb66-kube-api-access-k2mqj\") pod \"glance-operator-controller-manager-8bc4775b5-vkwpn\" (UID: \"008d7fd6-b4bf-44bd-b06f-fc3a8787cb66\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.043644 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.066588 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.080659 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.084584 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.085620 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.110464 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dqvwc" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.110704 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.114166 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s99ll\" (UniqueName: \"kubernetes.io/projected/29c17248-6b6c-4ab7-8204-0f5d34a30da5-kube-api-access-s99ll\") pod \"barbican-operator-controller-manager-f7f98cb69-8s74j\" (UID: \"29c17248-6b6c-4ab7-8204-0f5d34a30da5\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.114870 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.120483 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5cjn\" (UniqueName: \"kubernetes.io/projected/ea3d1e11-a06c-4cc4-af77-725fdafb57c8-kube-api-access-l5cjn\") pod \"horizon-operator-controller-manager-679b4759bb-8ftj5\" (UID: \"ea3d1e11-a06c-4cc4-af77-725fdafb57c8\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.120742 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsq5b\" (UniqueName: \"kubernetes.io/projected/f38cae68-c345-48f7-9be3-ea9467cb5485-kube-api-access-jsq5b\") pod \"cinder-operator-controller-manager-859cd486d-rpc2r\" (UID: \"f38cae68-c345-48f7-9be3-ea9467cb5485\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.120922 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2mqj\" (UniqueName: \"kubernetes.io/projected/008d7fd6-b4bf-44bd-b06f-fc3a8787cb66-kube-api-access-k2mqj\") pod 
\"glance-operator-controller-manager-8bc4775b5-vkwpn\" (UID: \"008d7fd6-b4bf-44bd-b06f-fc3a8787cb66\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.121435 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bhnh\" (UniqueName: \"kubernetes.io/projected/a4cba4a2-dc1b-485e-b141-a4d7f82176ac-kube-api-access-7bhnh\") pod \"designate-operator-controller-manager-77fb7bcf5b-lhnzv\" (UID: \"a4cba4a2-dc1b-485e-b141-a4d7f82176ac\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.121850 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkr6n\" (UniqueName: \"kubernetes.io/projected/3b13bc76-0bcb-48f3-9e18-f04720087325-kube-api-access-pkr6n\") pod \"heat-operator-controller-manager-5b4fc86755-hkxs6\" (UID: \"3b13bc76-0bcb-48f3-9e18-f04720087325\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.154148 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.166864 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.168068 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-dllpc" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.178638 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bhnh\" (UniqueName: \"kubernetes.io/projected/a4cba4a2-dc1b-485e-b141-a4d7f82176ac-kube-api-access-7bhnh\") pod \"designate-operator-controller-manager-77fb7bcf5b-lhnzv\" (UID: \"a4cba4a2-dc1b-485e-b141-a4d7f82176ac\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.181139 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5cjn\" (UniqueName: \"kubernetes.io/projected/ea3d1e11-a06c-4cc4-af77-725fdafb57c8-kube-api-access-l5cjn\") pod \"horizon-operator-controller-manager-679b4759bb-8ftj5\" (UID: \"ea3d1e11-a06c-4cc4-af77-725fdafb57c8\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.188907 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2mqj\" (UniqueName: \"kubernetes.io/projected/008d7fd6-b4bf-44bd-b06f-fc3a8787cb66-kube-api-access-k2mqj\") pod \"glance-operator-controller-manager-8bc4775b5-vkwpn\" (UID: \"008d7fd6-b4bf-44bd-b06f-fc3a8787cb66\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.190326 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkr6n\" (UniqueName: \"kubernetes.io/projected/3b13bc76-0bcb-48f3-9e18-f04720087325-kube-api-access-pkr6n\") pod \"heat-operator-controller-manager-5b4fc86755-hkxs6\" (UID: \"3b13bc76-0bcb-48f3-9e18-f04720087325\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6" Sep 30 13:51:47 crc 
kubenswrapper[4763]: I0930 13:51:47.191107 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.191562 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s99ll\" (UniqueName: \"kubernetes.io/projected/29c17248-6b6c-4ab7-8204-0f5d34a30da5-kube-api-access-s99ll\") pod \"barbican-operator-controller-manager-f7f98cb69-8s74j\" (UID: \"29c17248-6b6c-4ab7-8204-0f5d34a30da5\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.192099 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsq5b\" (UniqueName: \"kubernetes.io/projected/f38cae68-c345-48f7-9be3-ea9467cb5485-kube-api-access-jsq5b\") pod \"cinder-operator-controller-manager-859cd486d-rpc2r\" (UID: \"f38cae68-c345-48f7-9be3-ea9467cb5485\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.193683 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.195653 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.198107 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.204414 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.205802 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.207560 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.208789 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.220085 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6vqh9" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.220349 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-682sn" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.220487 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-scpjr" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.221323 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.226837 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j7cf\" (UniqueName: \"kubernetes.io/projected/ec701ff9-8d7e-43ef-8887-bafe3f09deba-kube-api-access-6j7cf\") pod \"infra-operator-controller-manager-7d9c7d9477-g7cnm\" (UID: \"ec701ff9-8d7e-43ef-8887-bafe3f09deba\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.226946 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec701ff9-8d7e-43ef-8887-bafe3f09deba-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-g7cnm\" (UID: \"ec701ff9-8d7e-43ef-8887-bafe3f09deba\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.227943 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.232614 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.232965 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.233623 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.235666 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xshjk" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.246142 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.254951 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.268654 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.269691 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.274157 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-dwpd6" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.275634 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.279581 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.312217 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.312711 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.313702 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.315229 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.315746 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-5pptr" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.331118 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdttv\" (UniqueName: \"kubernetes.io/projected/5d3c4b15-3e62-4fe1-ba6c-37100873dc7e-kube-api-access-tdttv\") pod \"manila-operator-controller-manager-b7cf8cb5f-594nq\" (UID: \"5d3c4b15-3e62-4fe1-ba6c-37100873dc7e\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.331172 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec701ff9-8d7e-43ef-8887-bafe3f09deba-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-g7cnm\" (UID: \"ec701ff9-8d7e-43ef-8887-bafe3f09deba\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.331200 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzbdc\" (UniqueName: \"kubernetes.io/projected/1b8d1e87-64b4-4462-a46d-822489fa80f7-kube-api-access-dzbdc\") pod \"keystone-operator-controller-manager-59d7dc95cf-2sn8f\" (UID: \"1b8d1e87-64b4-4462-a46d-822489fa80f7\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.331234 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhl4m\" (UniqueName: \"kubernetes.io/projected/8622909b-a085-4553-8bbc-9a33d5f6df74-kube-api-access-lhl4m\") pod \"ironic-operator-controller-manager-6f589bc7f7-qbqvs\" (UID: \"8622909b-a085-4553-8bbc-9a33d5f6df74\") " pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.331270 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnrff\" (UniqueName: \"kubernetes.io/projected/7dccabec-591d-4737-8977-1aa8b6fd5907-kube-api-access-jnrff\") pod \"mariadb-operator-controller-manager-67bf5bb885-6grkh\" (UID: \"7dccabec-591d-4737-8977-1aa8b6fd5907\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.331297 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m825\" (UniqueName: \"kubernetes.io/projected/08f0bef7-63c6-4118-a5f3-953efc2e638c-kube-api-access-7m825\") pod \"neutron-operator-controller-manager-6b96467f46-vtdfz\" (UID: \"08f0bef7-63c6-4118-a5f3-953efc2e638c\") " pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.331320 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j7cf\" (UniqueName: \"kubernetes.io/projected/ec701ff9-8d7e-43ef-8887-bafe3f09deba-kube-api-access-6j7cf\") pod \"infra-operator-controller-manager-7d9c7d9477-g7cnm\" (UID: \"ec701ff9-8d7e-43ef-8887-bafe3f09deba\") " 
pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:51:47 crc kubenswrapper[4763]: E0930 13:51:47.331576 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 13:51:47 crc kubenswrapper[4763]: E0930 13:51:47.331682 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec701ff9-8d7e-43ef-8887-bafe3f09deba-cert podName:ec701ff9-8d7e-43ef-8887-bafe3f09deba nodeName:}" failed. No retries permitted until 2025-09-30 13:51:47.831655364 +0000 UTC m=+979.970215649 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec701ff9-8d7e-43ef-8887-bafe3f09deba-cert") pod "infra-operator-controller-manager-7d9c7d9477-g7cnm" (UID: "ec701ff9-8d7e-43ef-8887-bafe3f09deba") : secret "infra-operator-webhook-server-cert" not found Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.331892 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.332123 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.350256 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.351260 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.353246 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.353696 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pjhlq" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.356722 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.357967 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.360208 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.361180 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.361515 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-d4vgr" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.380214 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-dbcqx" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.383695 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j7cf\" (UniqueName: \"kubernetes.io/projected/ec701ff9-8d7e-43ef-8887-bafe3f09deba-kube-api-access-6j7cf\") pod \"infra-operator-controller-manager-7d9c7d9477-g7cnm\" (UID: \"ec701ff9-8d7e-43ef-8887-bafe3f09deba\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.406256 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.427370 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.437402 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-247vx\" (UniqueName: \"kubernetes.io/projected/10a71b21-16cb-4064-b639-fbfc2893812a-kube-api-access-247vx\") pod \"octavia-operator-controller-manager-6fb7d6b8bf-6mp69\" (UID: \"10a71b21-16cb-4064-b639-fbfc2893812a\") " pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.437500 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g46m\" (UniqueName: \"kubernetes.io/projected/4addb186-b77b-4a86-85fc-87604ccb3c09-kube-api-access-8g46m\") pod \"nova-operator-controller-manager-79f9fc9fd8-22fzq\" (UID: \"4addb186-b77b-4a86-85fc-87604ccb3c09\") " pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.437535 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzbdc\" (UniqueName: \"kubernetes.io/projected/1b8d1e87-64b4-4462-a46d-822489fa80f7-kube-api-access-dzbdc\") pod \"keystone-operator-controller-manager-59d7dc95cf-2sn8f\" (UID: \"1b8d1e87-64b4-4462-a46d-822489fa80f7\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.437649 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhl4m\" (UniqueName: \"kubernetes.io/projected/8622909b-a085-4553-8bbc-9a33d5f6df74-kube-api-access-lhl4m\") pod \"ironic-operator-controller-manager-6f589bc7f7-qbqvs\" (UID: \"8622909b-a085-4553-8bbc-9a33d5f6df74\") " pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.437771 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnrff\" (UniqueName: \"kubernetes.io/projected/7dccabec-591d-4737-8977-1aa8b6fd5907-kube-api-access-jnrff\") pod 
\"mariadb-operator-controller-manager-67bf5bb885-6grkh\" (UID: \"7dccabec-591d-4737-8977-1aa8b6fd5907\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.437867 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m825\" (UniqueName: \"kubernetes.io/projected/08f0bef7-63c6-4118-a5f3-953efc2e638c-kube-api-access-7m825\") pod \"neutron-operator-controller-manager-6b96467f46-vtdfz\" (UID: \"08f0bef7-63c6-4118-a5f3-953efc2e638c\") " pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.437958 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdttv\" (UniqueName: \"kubernetes.io/projected/5d3c4b15-3e62-4fe1-ba6c-37100873dc7e-kube-api-access-tdttv\") pod \"manila-operator-controller-manager-b7cf8cb5f-594nq\" (UID: \"5d3c4b15-3e62-4fe1-ba6c-37100873dc7e\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.490686 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.491925 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.493832 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-snd88" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.519733 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.520833 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.530013 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-grvw8" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.530181 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.535583 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.538733 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfzrw\" (UniqueName: \"kubernetes.io/projected/5bf8cc8d-90d0-4687-bfca-fd75f8d1c308-kube-api-access-lfzrw\") pod \"placement-operator-controller-manager-598c4c8547-jkggm\" (UID: \"5bf8cc8d-90d0-4687-bfca-fd75f8d1c308\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.538855 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-247vx\" (UniqueName: \"kubernetes.io/projected/10a71b21-16cb-4064-b639-fbfc2893812a-kube-api-access-247vx\") pod \"octavia-operator-controller-manager-6fb7d6b8bf-6mp69\" (UID: \"10a71b21-16cb-4064-b639-fbfc2893812a\") " pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.538949 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g46m\" (UniqueName: \"kubernetes.io/projected/4addb186-b77b-4a86-85fc-87604ccb3c09-kube-api-access-8g46m\") pod \"nova-operator-controller-manager-79f9fc9fd8-22fzq\" (UID: \"4addb186-b77b-4a86-85fc-87604ccb3c09\") " pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.539042 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5acc6630-bf7d-4acf-b724-60e722171e8f-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds\" (UID: \"5acc6630-bf7d-4acf-b724-60e722171e8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.539203 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftjqt\" (UniqueName: \"kubernetes.io/projected/55edd3bf-c291-4659-a6dc-1c348d04799c-kube-api-access-ftjqt\") pod \"ovn-operator-controller-manager-84c745747f-r5sn8\" (UID: \"55edd3bf-c291-4659-a6dc-1c348d04799c\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.539289 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsfck\" (UniqueName: \"kubernetes.io/projected/5acc6630-bf7d-4acf-b724-60e722171e8f-kube-api-access-lsfck\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds\" (UID: \"5acc6630-bf7d-4acf-b724-60e722171e8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" Sep 30 
13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.547349 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.548741 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.549706 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.551761 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-fhc4b" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.560504 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.596679 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhl4m\" (UniqueName: \"kubernetes.io/projected/8622909b-a085-4553-8bbc-9a33d5f6df74-kube-api-access-lhl4m\") pod \"ironic-operator-controller-manager-6f589bc7f7-qbqvs\" (UID: \"8622909b-a085-4553-8bbc-9a33d5f6df74\") " pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.597454 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m825\" (UniqueName: \"kubernetes.io/projected/08f0bef7-63c6-4118-a5f3-953efc2e638c-kube-api-access-7m825\") pod \"neutron-operator-controller-manager-6b96467f46-vtdfz\" (UID: \"08f0bef7-63c6-4118-a5f3-953efc2e638c\") " pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.598514 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzbdc\" (UniqueName: \"kubernetes.io/projected/1b8d1e87-64b4-4462-a46d-822489fa80f7-kube-api-access-dzbdc\") pod \"keystone-operator-controller-manager-59d7dc95cf-2sn8f\" (UID: \"1b8d1e87-64b4-4462-a46d-822489fa80f7\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.599672 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnrff\" (UniqueName: \"kubernetes.io/projected/7dccabec-591d-4737-8977-1aa8b6fd5907-kube-api-access-jnrff\") pod \"mariadb-operator-controller-manager-67bf5bb885-6grkh\" (UID: \"7dccabec-591d-4737-8977-1aa8b6fd5907\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.599957 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdttv\" (UniqueName: \"kubernetes.io/projected/5d3c4b15-3e62-4fe1-ba6c-37100873dc7e-kube-api-access-tdttv\") pod \"manila-operator-controller-manager-b7cf8cb5f-594nq\" (UID: \"5d3c4b15-3e62-4fe1-ba6c-37100873dc7e\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.607624 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g46m\" (UniqueName: \"kubernetes.io/projected/4addb186-b77b-4a86-85fc-87604ccb3c09-kube-api-access-8g46m\") pod 
\"nova-operator-controller-manager-79f9fc9fd8-22fzq\" (UID: \"4addb186-b77b-4a86-85fc-87604ccb3c09\") " pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.631083 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-247vx\" (UniqueName: \"kubernetes.io/projected/10a71b21-16cb-4064-b639-fbfc2893812a-kube-api-access-247vx\") pod \"octavia-operator-controller-manager-6fb7d6b8bf-6mp69\" (UID: \"10a71b21-16cb-4064-b639-fbfc2893812a\") " pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.641232 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5acc6630-bf7d-4acf-b724-60e722171e8f-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds\" (UID: \"5acc6630-bf7d-4acf-b724-60e722171e8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.641277 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrzd\" (UniqueName: \"kubernetes.io/projected/02b9b96d-a908-4964-ac52-b5b8f73ffbef-kube-api-access-rvrzd\") pod \"test-operator-controller-manager-6bb97fcf96-v8f7j\" (UID: \"02b9b96d-a908-4964-ac52-b5b8f73ffbef\") " pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.641332 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftjqt\" (UniqueName: \"kubernetes.io/projected/55edd3bf-c291-4659-a6dc-1c348d04799c-kube-api-access-ftjqt\") pod \"ovn-operator-controller-manager-84c745747f-r5sn8\" (UID: \"55edd3bf-c291-4659-a6dc-1c348d04799c\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.641360 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsfck\" (UniqueName: \"kubernetes.io/projected/5acc6630-bf7d-4acf-b724-60e722171e8f-kube-api-access-lsfck\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds\" (UID: \"5acc6630-bf7d-4acf-b724-60e722171e8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.641433 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9zjf\" (UniqueName: \"kubernetes.io/projected/14beb357-7d8b-4cbd-bda6-56eddcd765b0-kube-api-access-f9zjf\") pod \"telemetry-operator-controller-manager-cb66d6b59-p95zl\" (UID: \"14beb357-7d8b-4cbd-bda6-56eddcd765b0\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.641463 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2d9d\" (UniqueName: \"kubernetes.io/projected/6a1bd649-6042-4f29-b6ab-cb3bcfcdca51-kube-api-access-n2d9d\") pod \"swift-operator-controller-manager-657c6b68c7-wnh7g\" (UID: \"6a1bd649-6042-4f29-b6ab-cb3bcfcdca51\") " pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.641485 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lfzrw\" (UniqueName: \"kubernetes.io/projected/5bf8cc8d-90d0-4687-bfca-fd75f8d1c308-kube-api-access-lfzrw\") pod \"placement-operator-controller-manager-598c4c8547-jkggm\" (UID: \"5bf8cc8d-90d0-4687-bfca-fd75f8d1c308\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" Sep 30 13:51:47 crc kubenswrapper[4763]: E0930 13:51:47.672116 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 13:51:47 crc kubenswrapper[4763]: E0930 13:51:47.672212 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5acc6630-bf7d-4acf-b724-60e722171e8f-cert podName:5acc6630-bf7d-4acf-b724-60e722171e8f nodeName:}" failed. No retries permitted until 2025-09-30 13:51:48.172172485 +0000 UTC m=+980.310732770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5acc6630-bf7d-4acf-b724-60e722171e8f-cert") pod "openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" (UID: "5acc6630-bf7d-4acf-b724-60e722171e8f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.674746 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.675635 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.710536 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.717062 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.790308 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9zjf\" (UniqueName: \"kubernetes.io/projected/14beb357-7d8b-4cbd-bda6-56eddcd765b0-kube-api-access-f9zjf\") pod \"telemetry-operator-controller-manager-cb66d6b59-p95zl\" (UID: \"14beb357-7d8b-4cbd-bda6-56eddcd765b0\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.798577 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.801084 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.813436 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.821417 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2d9d\" (UniqueName: \"kubernetes.io/projected/6a1bd649-6042-4f29-b6ab-cb3bcfcdca51-kube-api-access-n2d9d\") pod \"swift-operator-controller-manager-657c6b68c7-wnh7g\" (UID: \"6a1bd649-6042-4f29-b6ab-cb3bcfcdca51\") " pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.821481 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfzrw\" (UniqueName: \"kubernetes.io/projected/5bf8cc8d-90d0-4687-bfca-fd75f8d1c308-kube-api-access-lfzrw\") pod \"placement-operator-controller-manager-598c4c8547-jkggm\" (UID: \"5bf8cc8d-90d0-4687-bfca-fd75f8d1c308\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.821732 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-xqfh9" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.823174 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrzd\" (UniqueName: \"kubernetes.io/projected/02b9b96d-a908-4964-ac52-b5b8f73ffbef-kube-api-access-rvrzd\") pod \"test-operator-controller-manager-6bb97fcf96-v8f7j\" (UID: \"02b9b96d-a908-4964-ac52-b5b8f73ffbef\") " pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.823345 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.827189 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.836005 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.837115 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.836997 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftjqt\" (UniqueName: \"kubernetes.io/projected/55edd3bf-c291-4659-a6dc-1c348d04799c-kube-api-access-ftjqt\") pod \"ovn-operator-controller-manager-84c745747f-r5sn8\" (UID: \"55edd3bf-c291-4659-a6dc-1c348d04799c\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.839242 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsfck\" (UniqueName: \"kubernetes.io/projected/5acc6630-bf7d-4acf-b724-60e722171e8f-kube-api-access-lsfck\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds\" (UID: \"5acc6630-bf7d-4acf-b724-60e722171e8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.840010 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9zjf\" (UniqueName: \"kubernetes.io/projected/14beb357-7d8b-4cbd-bda6-56eddcd765b0-kube-api-access-f9zjf\") pod \"telemetry-operator-controller-manager-cb66d6b59-p95zl\" (UID: \"14beb357-7d8b-4cbd-bda6-56eddcd765b0\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.851292 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2d9d\" (UniqueName: \"kubernetes.io/projected/6a1bd649-6042-4f29-b6ab-cb3bcfcdca51-kube-api-access-n2d9d\") pod \"swift-operator-controller-manager-657c6b68c7-wnh7g\" (UID: \"6a1bd649-6042-4f29-b6ab-cb3bcfcdca51\") " pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.863398 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrzd\" (UniqueName: \"kubernetes.io/projected/02b9b96d-a908-4964-ac52-b5b8f73ffbef-kube-api-access-rvrzd\") pod \"test-operator-controller-manager-6bb97fcf96-v8f7j\" (UID: \"02b9b96d-a908-4964-ac52-b5b8f73ffbef\") " pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.868585 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.871441 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.872396 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.873527 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.884387 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vd9d5" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.886175 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.923733 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-fdr5r"] Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.925026 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-fdr5r" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.927705 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-h89sc" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.928576 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b604b2-49b7-4471-9e02-161a0caebc4b-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-m2cnh\" (UID: \"d8b604b2-49b7-4471-9e02-161a0caebc4b\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.928652 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xrc9\" (UniqueName: \"kubernetes.io/projected/9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd-kube-api-access-7xrc9\") pod \"watcher-operator-controller-manager-75756dd4d9-72d45\" (UID: \"9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd\") " pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.928715 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f68n5\" (UniqueName: \"kubernetes.io/projected/d8b604b2-49b7-4471-9e02-161a0caebc4b-kube-api-access-f68n5\") pod \"openstack-operator-controller-manager-7b7bb8bd67-m2cnh\" (UID: \"d8b604b2-49b7-4471-9e02-161a0caebc4b\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.928797 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec701ff9-8d7e-43ef-8887-bafe3f09deba-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-g7cnm\" (UID: \"ec701ff9-8d7e-43ef-8887-bafe3f09deba\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:51:47 crc kubenswrapper[4763]: E0930 13:51:47.928947 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 13:51:47 crc kubenswrapper[4763]: E0930 13:51:47.929030 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec701ff9-8d7e-43ef-8887-bafe3f09deba-cert 
podName:ec701ff9-8d7e-43ef-8887-bafe3f09deba nodeName:}" failed. No retries permitted until 2025-09-30 13:51:48.929005504 +0000 UTC m=+981.067565789 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec701ff9-8d7e-43ef-8887-bafe3f09deba-cert") pod "infra-operator-controller-manager-7d9c7d9477-g7cnm" (UID: "ec701ff9-8d7e-43ef-8887-bafe3f09deba") : secret "infra-operator-webhook-server-cert" not found Sep 30 13:51:47 crc kubenswrapper[4763]: I0930 13:51:47.943385 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-fdr5r"] Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.020375 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j"] Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.030026 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xrc9\" (UniqueName: \"kubernetes.io/projected/9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd-kube-api-access-7xrc9\") pod \"watcher-operator-controller-manager-75756dd4d9-72d45\" (UID: \"9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd\") " pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.030070 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xh8s\" (UniqueName: \"kubernetes.io/projected/6efd5b9a-3e7d-4913-930d-3fe4452551b6-kube-api-access-5xh8s\") pod \"rabbitmq-cluster-operator-manager-79d8469568-fdr5r\" (UID: \"6efd5b9a-3e7d-4913-930d-3fe4452551b6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-fdr5r" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.030110 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f68n5\" (UniqueName: \"kubernetes.io/projected/d8b604b2-49b7-4471-9e02-161a0caebc4b-kube-api-access-f68n5\") pod \"openstack-operator-controller-manager-7b7bb8bd67-m2cnh\" (UID: \"d8b604b2-49b7-4471-9e02-161a0caebc4b\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.030198 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b604b2-49b7-4471-9e02-161a0caebc4b-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-m2cnh\" (UID: \"d8b604b2-49b7-4471-9e02-161a0caebc4b\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:51:48 crc kubenswrapper[4763]: E0930 13:51:48.030312 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 13:51:48 crc kubenswrapper[4763]: E0930 13:51:48.030357 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8b604b2-49b7-4471-9e02-161a0caebc4b-cert podName:d8b604b2-49b7-4471-9e02-161a0caebc4b nodeName:}" failed. No retries permitted until 2025-09-30 13:51:48.530343489 +0000 UTC m=+980.668903774 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8b604b2-49b7-4471-9e02-161a0caebc4b-cert") pod "openstack-operator-controller-manager-7b7bb8bd67-m2cnh" (UID: "d8b604b2-49b7-4471-9e02-161a0caebc4b") : secret "webhook-server-cert" not found Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.040220 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.047477 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xrc9\" (UniqueName: \"kubernetes.io/projected/9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd-kube-api-access-7xrc9\") pod \"watcher-operator-controller-manager-75756dd4d9-72d45\" (UID: \"9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd\") " pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.065364 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.066859 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f68n5\" (UniqueName: \"kubernetes.io/projected/d8b604b2-49b7-4471-9e02-161a0caebc4b-kube-api-access-f68n5\") pod \"openstack-operator-controller-manager-7b7bb8bd67-m2cnh\" (UID: \"d8b604b2-49b7-4471-9e02-161a0caebc4b\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.084511 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.096445 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.131247 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xh8s\" (UniqueName: \"kubernetes.io/projected/6efd5b9a-3e7d-4913-930d-3fe4452551b6-kube-api-access-5xh8s\") pod \"rabbitmq-cluster-operator-manager-79d8469568-fdr5r\" (UID: \"6efd5b9a-3e7d-4913-930d-3fe4452551b6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-fdr5r" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.158507 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xh8s\" (UniqueName: \"kubernetes.io/projected/6efd5b9a-3e7d-4913-930d-3fe4452551b6-kube-api-access-5xh8s\") pod \"rabbitmq-cluster-operator-manager-79d8469568-fdr5r\" (UID: \"6efd5b9a-3e7d-4913-930d-3fe4452551b6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-fdr5r" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.232729 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5acc6630-bf7d-4acf-b724-60e722171e8f-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds\" (UID: \"5acc6630-bf7d-4acf-b724-60e722171e8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.272469 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5acc6630-bf7d-4acf-b724-60e722171e8f-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds\" (UID: \"5acc6630-bf7d-4acf-b724-60e722171e8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.295069 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j" event={"ID":"29c17248-6b6c-4ab7-8204-0f5d34a30da5","Type":"ContainerStarted","Data":"dfb07ab88d5f0416c505472a12dfdeb37f906f0e36f52e04c858d8937fb44b54"} Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.436056 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-fdr5r" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.537176 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b604b2-49b7-4471-9e02-161a0caebc4b-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-m2cnh\" (UID: \"d8b604b2-49b7-4471-9e02-161a0caebc4b\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:51:48 crc kubenswrapper[4763]: E0930 13:51:48.537406 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 13:51:48 crc kubenswrapper[4763]: E0930 13:51:48.537473 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8b604b2-49b7-4471-9e02-161a0caebc4b-cert podName:d8b604b2-49b7-4471-9e02-161a0caebc4b nodeName:}" failed. No retries permitted until 2025-09-30 13:51:49.537457283 +0000 UTC m=+981.676017578 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8b604b2-49b7-4471-9e02-161a0caebc4b-cert") pod "openstack-operator-controller-manager-7b7bb8bd67-m2cnh" (UID: "d8b604b2-49b7-4471-9e02-161a0caebc4b") : secret "webhook-server-cert" not found Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.556978 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.654573 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv"] Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.828639 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn"] Sep 30 13:51:48 crc kubenswrapper[4763]: W0930 13:51:48.836896 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod008d7fd6_b4bf_44bd_b06f_fc3a8787cb66.slice/crio-c57e1eca8bf953bab0fc9674e14c76c02464039b3603489f5a1c8e40c8975123 WatchSource:0}: Error finding container c57e1eca8bf953bab0fc9674e14c76c02464039b3603489f5a1c8e40c8975123: Status 404 returned error can't find the container with id c57e1eca8bf953bab0fc9674e14c76c02464039b3603489f5a1c8e40c8975123 Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.854668 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5"] Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.873635 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r"] Sep 30 13:51:48 crc kubenswrapper[4763]: W0930 13:51:48.885006 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38cae68_c345_48f7_9be3_ea9467cb5485.slice/crio-a259f0aa7ac081fb389c2b5965a0fd1f7afe6048da13d97edb5737555cb8eabb WatchSource:0}: Error finding container a259f0aa7ac081fb389c2b5965a0fd1f7afe6048da13d97edb5737555cb8eabb: Status 404 returned error can't find the container with id a259f0aa7ac081fb389c2b5965a0fd1f7afe6048da13d97edb5737555cb8eabb Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.944710 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec701ff9-8d7e-43ef-8887-bafe3f09deba-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-g7cnm\" (UID: \"ec701ff9-8d7e-43ef-8887-bafe3f09deba\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:51:48 crc kubenswrapper[4763]: I0930 13:51:48.960295 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec701ff9-8d7e-43ef-8887-bafe3f09deba-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-g7cnm\" (UID: \"ec701ff9-8d7e-43ef-8887-bafe3f09deba\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.167353 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j"] Sep 30 13:51:49 crc kubenswrapper[4763]: W0930 13:51:49.176741 4763 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02b9b96d_a908_4964_ac52_b5b8f73ffbef.slice/crio-bb3f7eb53dddc5fb3db3153a5754ed377c39e43649c1c0afa399de72ffc0eef0 WatchSource:0}: Error finding container bb3f7eb53dddc5fb3db3153a5754ed377c39e43649c1c0afa399de72ffc0eef0: Status 404 returned error can't find the container with id bb3f7eb53dddc5fb3db3153a5754ed377c39e43649c1c0afa399de72ffc0eef0 Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.197992 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.232231 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.251657 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.279733 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.291556 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.308502 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.311074 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5" event={"ID":"ea3d1e11-a06c-4cc4-af77-725fdafb57c8","Type":"ContainerStarted","Data":"f4dc60062e758cae218dfe333f746e31377dc194c6040439e62e276baa318152"} Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.316321 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.317320 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn" event={"ID":"008d7fd6-b4bf-44bd-b06f-fc3a8787cb66","Type":"ContainerStarted","Data":"c57e1eca8bf953bab0fc9674e14c76c02464039b3603489f5a1c8e40c8975123"} Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.319759 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv" event={"ID":"a4cba4a2-dc1b-485e-b141-a4d7f82176ac","Type":"ContainerStarted","Data":"ba688cbf0ea48984be99368ac14e1a85414571fce2225f457cca000f59a78b94"} Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.321498 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh" event={"ID":"7dccabec-591d-4737-8977-1aa8b6fd5907","Type":"ContainerStarted","Data":"134dfba76dfcb5c2c0799cca63a526b955f66f4cfc80f91c50839038ecac3fa6"} Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.322730 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f" 
event={"ID":"1b8d1e87-64b4-4462-a46d-822489fa80f7","Type":"ContainerStarted","Data":"e844de6d716075e57176d28de557b1d3315e4a1270c965d0e0c135a9fa200291"} Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.323848 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j" event={"ID":"02b9b96d-a908-4964-ac52-b5b8f73ffbef","Type":"ContainerStarted","Data":"bb3f7eb53dddc5fb3db3153a5754ed377c39e43649c1c0afa399de72ffc0eef0"} Sep 30 13:51:49 crc kubenswrapper[4763]: W0930 13:51:49.324113 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6efd5b9a_3e7d_4913_930d_3fe4452551b6.slice/crio-424df5da21edc8bc729834f1848775d11f6943f3e4a7bf58e840d6a5dcb57883 WatchSource:0}: Error finding container 424df5da21edc8bc729834f1848775d11f6943f3e4a7bf58e840d6a5dcb57883: Status 404 returned error can't find the container with id 424df5da21edc8bc729834f1848775d11f6943f3e4a7bf58e840d6a5dcb57883 Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.325749 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r" event={"ID":"f38cae68-c345-48f7-9be3-ea9467cb5485","Type":"ContainerStarted","Data":"a259f0aa7ac081fb389c2b5965a0fd1f7afe6048da13d97edb5737555cb8eabb"} Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.329267 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.337376 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl"] Sep 30 13:51:49 crc kubenswrapper[4763]: W0930 13:51:49.342858 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55edd3bf_c291_4659_a6dc_1c348d04799c.slice/crio-d8c0878213c16373ca9d93567b1e08822d8a5972a551ba27e1cec62a554ee294 WatchSource:0}: Error finding container d8c0878213c16373ca9d93567b1e08822d8a5972a551ba27e1cec62a554ee294: Status 404 returned error can't find the container with id d8c0878213c16373ca9d93567b1e08822d8a5972a551ba27e1cec62a554ee294 Sep 30 13:51:49 crc kubenswrapper[4763]: W0930 13:51:49.345164 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b13bc76_0bcb_48f3_9e18_f04720087325.slice/crio-6a3600bcf3c5364b73e0f5bad879ed410b5699ce9241098ba78e259a4ba61bd7 WatchSource:0}: Error finding container 6a3600bcf3c5364b73e0f5bad879ed410b5699ce9241098ba78e259a4ba61bd7: Status 404 returned error can't find the container with id 6a3600bcf3c5364b73e0f5bad879ed410b5699ce9241098ba78e259a4ba61bd7 Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.348229 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-fdr5r"] Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.354945 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lfzrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-598c4c8547-jkggm_openstack-operators(5bf8cc8d-90d0-4687-bfca-fd75f8d1c308): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.357054 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7m825,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6b96467f46-vtdfz_openstack-operators(08f0bef7-63c6-4118-a5f3-953efc2e638c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 13:51:49 crc kubenswrapper[4763]: W0930 13:51:49.361105 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d3c4b15_3e62_4fe1_ba6c_37100873dc7e.slice/crio-dbc096303d122ada2802e93bb527b14f09dadc4b094cef0e2d8b9980aefbe844 WatchSource:0}: Error finding container dbc096303d122ada2802e93bb527b14f09dadc4b094cef0e2d8b9980aefbe844: Status 404 returned error can't find the container with id dbc096303d122ada2802e93bb527b14f09dadc4b094cef0e2d8b9980aefbe844 Sep 30 13:51:49 crc kubenswrapper[4763]: W0930 13:51:49.371178 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c68c2c4_d0be_4bf4_a83c_975d1eb9a1dd.slice/crio-aafe9cb016ec62f87ec6a355fe51414a1f64f91198dc2788344feca6ae3e66ed WatchSource:0}: Error finding container aafe9cb016ec62f87ec6a355fe51414a1f64f91198dc2788344feca6ae3e66ed: Status 404 returned error can't find the container with id aafe9cb016ec62f87ec6a355fe51414a1f64f91198dc2788344feca6ae3e66ed Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.371808 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tdttv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-b7cf8cb5f-594nq_openstack-operators(5d3c4b15-3e62-4fe1-ba6c-37100873dc7e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.373419 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.378396 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6"] Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.379009 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7xrc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-75756dd4d9-72d45_openstack-operators(9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.379261 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n2d9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-657c6b68c7-wnh7g_openstack-operators(6a1bd649-6042-4f29-b6ab-cb3bcfcdca51): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.379459 4763 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-247vx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6fb7d6b8bf-6mp69_openstack-operators(10a71b21-16cb-4064-b639-fbfc2893812a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.379849 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:89f9e06c633ae852be8d3e3ca581def0a6e9a5b38c0d519f656976c7414b6b97,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:56f155abc1b8734e4a79c7306ba38caf8d2881625f37d2f9c5a5763fa4db7e02,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:29c8cd4f2d853f512e2ecd44f522f28c3aac046a72733365aa5e91667041d62e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:ed896681f0d9720f56bbcb0b7a4f3626ed397e89af919604ca68b42b7b598859,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:712e1c932a90ef5e3c3ee5d5aea591a377da8c4af604ebd8ec399869a61dfbef,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:10fd8489a5bf6f1d781e9226de68356132db78b62269e69d632748cb08fae725,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:73fd28af83ea96cc920d26dba6105ee59f0824234527949884e6ca55b71d7533,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:8b3a90516ba0695cf3198a7b101da770c30c8100cb79f8088b5729e6a50ddd6d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:6d42bcf65422d2de9cd807feb3e8b005de10084b4b8eb340c8a9045644ae7aaa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:32a25ac44706b73bff04a89514177b1efd675f0442b295e225f0020555ca6350,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:b19043eac7c653e00da8da9418ae378fdd29698adb1adb4bf5ae7cfc03ba5538,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:c486e00b36ea7698d6a4cd9048a759bad5a8286e4949bbd1f82c3ddb70600b9b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:ef2727f0300fbf3bf15d8ddc409d0fd63e4aac9dd64c86459bd6ff64fc6b9534,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha2
56:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:329aac65ba00c3cf43bb1d5fac8818752f01de90b47719e2a84db4e2fe083292,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:6ce73885ac1ee7c69468efc448eff5deae46502812c5e3d099f771e1cc03345f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:282cc0fcdbb8a688dd62a2499480aae4a36b620f2160d51e6c8269e6cc32d5fc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:d98c0c9d3bdd84daf4b98d45b8bbe2e67a633491897dda7167664a5fa1f0f26e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:4ad1d36fe1c8992e43910fc2d566b991fd73f9b82b1ab860c66858448ff82c00,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:92789eab1b8a91807a5e898cb63478d125ae539eafe63c96049100c6ddeadb04,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:ee9832268e0df5d62c50c5ce171e9ef72a035aa74c718cfbf482e34426d8d15e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:07b4f96f24f32224c13613f85173f9fcc3092b8797ffa47519403d124bfe4c15,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:3a873c95bcb7ae8bd24ff1eb5fe89ac5272a41a3345a7b41d55419b5d66b70e7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:388dbae2f1aae2720e919cc24d10cd577b73b4e4ef7abdc34287bcb8d27ff98f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:d4c1b2496868da3dcca9f4bda0834fcc58d23c21d8ce3c42a68205d02039c487,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:c4414cc2680fb1bacbf99261f759f4ef7401fb2e4953140270bffdab8e002f22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:b9b950a656f1456b3143872c492b0987bf4a9e23bc7c59d843cf50099667b368,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:afd5d6822b86ea0930b2011fede834bb24495995d7baac03363ab61d89f07a22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/opensta
ck-neutron-dhcp-agent@sha256:665d7a25dfc959ec5448d5ba6b430792ebde1be1580ea6809e9b3b4f94184b3f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:499c6d82390ee2dbb91628d2e42671406372fb603d697685a04145cf6dd8d0ab,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:da2736bc98bfe340e86234523d4c00220f6f79add271900981cf4ad9f4c5ee51,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:4df8dad8a5fb4805a0424cbc0b8df666b9a06b76c64f26e186f3b9e8efe6cd95,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:65c16453b5b7bb113646ffce0be26138e89eecbf6dd1582cdfe76af7f5dc62cf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:ce968dce2209ec5114772b4b73ed16c0a25988637372f2afbfac080cc6f1e378,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:b7823eaacf55280cdf3f1bede4f40bf49fdbf9ba9f3f5ba64b0abedede601c8f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:605206d967ffaa20156eb07a645654cd3e0f880bb0eefbb2b5e1e749b169f148,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:9470db6caf5102cf37ddb1f137f17b05ef7119f174f4189beb4839ef7f65730c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:34e84da4ae7e5d65931cbefcda84fd8fdc93271ec466adf1a9040b67a3af176a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:b301b17c31e47733a8a232773427ce3cb50433a3aa09d4a5bd998b1aeb5e5530,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:d642c35c0f9d3acf31987c028f1d4d4fdf7b49e1d6cbcd73268c12b3d6e14b86,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:922eb0799ab36a91aa95abe52565dc60db807457dbf8c651b30e06b9e8aebcd4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:cd01e9605ab513458a6813e38d37fbfde1a91388cc5c00962203dbcbdc285e79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-ante
lope-centos9/openstack-ironic-neutron-agent@sha256:dd35c22b17730cbca8547ea98459f182939462c8dc3465d21335a377018937de,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:0e0e2e48a41d5417f1d6a4407e63d443611b7eacd66e27f561c9eedf3e5a66c5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:735bd24219fdb5f21c31313a5bc685364f45c004fb5e8af634984c147060d4e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:35b5554efae34f2c25a2d274c78bdaecf3d4ce949fa61c692835ee54cdfc6d74,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:01b93ab0d87482b9a1fd46706771974743dea1ca74f5fcc3de4a560f7cfc033b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:87471fbe3ba77b7115096f4fef8f5a9e1468cbd5bf6060c09785a60f9107a717,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:947dcc46173064939cba252d5db34eb6ddd05eb0af7afd762beebe77e9a72c6e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:8498ed720d02ce4e7045f7eb0051b138274cddba9b1e443d11e413da3474d3a3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:2cb054830655a6af5fc6848360618676d24fd9cf15078c0b9855e09d05733eec,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:0f5f8f560cd3b4951f7e8e67ef570575435b4c6915658cbb66f32a201776078b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:7055e8d7b7d72ce697c6077be14c525c019d186002f04765b90a14c82e01cc7c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:d2cd7a21461b4b569d93a63d57761f437cf6bd0847d69a3a65f64d400c7cca6d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:432c0c6f36a5e4e4db394771f7dc72f3bf9e5060dc4220f781d3c5050cc17f0d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:3ff379a74cc15352bfa25605dbb1a5f4250620e8364bf87ed2f3d5c17e6a8b26,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:c67a7bba2fc9351c302369b590473a737bab20d0982d227756fe1fa0bc1c8773,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-
health-manager@sha256:50c613d159667a26ba4bfb7aebf157b8db8919c815a866438b1d2700231a508e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:f3d3d7a7c83926a09714199406bfe8070e6be5055cbfbf00aa37f47e1e5e9bc9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:e9b3260907b0e417bb779a7d513a2639734cbbf792e77c61e05e760d06978f4a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:1aa6a76e67f2d91ee45472741238b5d4ab53f9bcb94db678c7ae92e1af28899d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:80b8547cf5821a4eb5461d1ac14edbc700ef03926268af960bf511647de027af,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content@sha256:7086442096db5ceb68e22bcce00688072957fdad07d00d8f18eb0506ad958923,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:bf42dfd2e225818662aa28c4bb23204dc47b2b91127ca0e49b085baa1ea7609d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:bd08ffdb4dcfd436200d846d15b2bdcc14122fa43adfea4c0980a087a18f9e3e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:2d1e733d24df6ca02636374147f801a0ec1509f8db2f9ad8c739b3f2341815fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:c08ba2a0df4cc18e615b25c329e9c74153709b435c032c38502ec78ba297c5fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:b6cdafc7722def5b63ef4f00251e10aca93ef82628b21e88925c3d4b49277316,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:0a0bbe43e3c266dfeb40a09036f76393dc70377b636724c130a29c434f6d6c82,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:7387b628d7cfb3ff349e0df6f11f41ae7fdb0e2d55844944896af02a81ac7cf7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:9a3671dee1752ebe3639a0b16de95d29e779f1629d563e0585d65b9792542fc9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:37cc031749b113c35231066ce9f8ce7ccc83e21808ba92ea1981e72bbc42e80f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:b2782fe02b1438d68308a5847b0628f0971b5bb8bb0a4d20fe15176fa75bd33f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-an
telope-centos9/openstack-swift-container@sha256:7118cc3a695fead2a8bab14c8ace018ed7a5ba23ef347bf4ead44219e8467866,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:793a836e17b07b0e0a4e8d3177fd04724e1e058fca275ef434abe60a2e444a79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:713d74dc81859344bdcae68a9f7a954146c3e68cfa819518a58cce9e896298c8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:e39be536015777a1b0df8ac863f354046b2b15fee8482abd37d2fa59d8074208,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:28e209c66bc86354495ac7793f2e66db0e8540485590742ab1b53a7cf24cb4fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:d117753b6cff563084bf771173ea89a2ce00854efdc45447667e5d230c60c363,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:f1aac0a57d83b085c37cf75ce0a56f85b68353b1a88740b64a5858bc93dba36b,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lsfck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds_openstack-operators(5acc6630-bf7d-4acf-b724-60e722171e8f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.383622 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.389366 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.397668 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.407777 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds"] Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.554895 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b604b2-49b7-4471-9e02-161a0caebc4b-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-m2cnh\" (UID: \"d8b604b2-49b7-4471-9e02-161a0caebc4b\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.565682 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b604b2-49b7-4471-9e02-161a0caebc4b-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-m2cnh\" (UID: \"d8b604b2-49b7-4471-9e02-161a0caebc4b\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.617939 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:51:49 crc kubenswrapper[4763]: I0930 13:51:49.642884 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm"] Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.650674 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" podUID="9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd" Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.692529 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" podUID="5d3c4b15-3e62-4fe1-ba6c-37100873dc7e" Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.715038 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" podUID="10a71b21-16cb-4064-b639-fbfc2893812a" Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.733175 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" podUID="5bf8cc8d-90d0-4687-bfca-fd75f8d1c308" Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.773964 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" podUID="5acc6630-bf7d-4acf-b724-60e722171e8f" Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.773999 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" podUID="08f0bef7-63c6-4118-a5f3-953efc2e638c" Sep 30 13:51:49 crc kubenswrapper[4763]: E0930 13:51:49.789310 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" podUID="6a1bd649-6042-4f29-b6ab-cb3bcfcdca51" Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.335140 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh"] Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.430976 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" event={"ID":"5d3c4b15-3e62-4fe1-ba6c-37100873dc7e","Type":"ContainerStarted","Data":"c74e5bd0b9f13eb28cdb483f80b55da93698a452134ce7bb33664b7df65ee1d6"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.431025 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" event={"ID":"5d3c4b15-3e62-4fe1-ba6c-37100873dc7e","Type":"ContainerStarted","Data":"dbc096303d122ada2802e93bb527b14f09dadc4b094cef0e2d8b9980aefbe844"} Sep 30 13:51:50 crc kubenswrapper[4763]: E0930 13:51:50.443011 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884\\\"\"" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" podUID="5d3c4b15-3e62-4fe1-ba6c-37100873dc7e" Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.450027 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" event={"ID":"9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd","Type":"ContainerStarted","Data":"6e4ac4cac4e66bdbac9fae672837133547647595955e3bbb86090f66d9cf0618"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.450081 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" event={"ID":"9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd","Type":"ContainerStarted","Data":"aafe9cb016ec62f87ec6a355fe51414a1f64f91198dc2788344feca6ae3e66ed"} Sep 30 13:51:50 crc kubenswrapper[4763]: E0930 13:51:50.450877 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" podUID="9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd" Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.453702 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" 
event={"ID":"5bf8cc8d-90d0-4687-bfca-fd75f8d1c308","Type":"ContainerStarted","Data":"95ff07a476245fcc25c00742350701c54d547c70df07bc72112a15116c7fc2f1"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.453745 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" event={"ID":"5bf8cc8d-90d0-4687-bfca-fd75f8d1c308","Type":"ContainerStarted","Data":"b3f708ac9b672e12ef9609f69841c8f4a4be7f8571b506a8fed18383cb0db4d1"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.455863 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl" event={"ID":"14beb357-7d8b-4cbd-bda6-56eddcd765b0","Type":"ContainerStarted","Data":"5fcae3fe17d4c6ac4ee5ef08c26c5ebe918c4ff4e8ea79862274a4ed7a151a45"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.459786 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs" event={"ID":"8622909b-a085-4553-8bbc-9a33d5f6df74","Type":"ContainerStarted","Data":"6526fafac47c2d6a1b6ce81a231ab01227152354b75faac97ec2559a315341cb"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.461077 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6" event={"ID":"3b13bc76-0bcb-48f3-9e18-f04720087325","Type":"ContainerStarted","Data":"6a3600bcf3c5364b73e0f5bad879ed410b5699ce9241098ba78e259a4ba61bd7"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.462813 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq" event={"ID":"4addb186-b77b-4a86-85fc-87604ccb3c09","Type":"ContainerStarted","Data":"fe9e4fba8e59e809df38fcf8542e065e7780bb9ffb64e4ef1e73d7630cd3b264"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.465444 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8" event={"ID":"55edd3bf-c291-4659-a6dc-1c348d04799c","Type":"ContainerStarted","Data":"d8c0878213c16373ca9d93567b1e08822d8a5972a551ba27e1cec62a554ee294"} Sep 30 13:51:50 crc kubenswrapper[4763]: E0930 13:51:50.468559 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" podUID="5bf8cc8d-90d0-4687-bfca-fd75f8d1c308" Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.477912 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" event={"ID":"10a71b21-16cb-4064-b639-fbfc2893812a","Type":"ContainerStarted","Data":"f3cba1f4f9e8b51724cd83e979bca46cb71cb64ae626d3a006a93fa9521e6529"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.477952 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" event={"ID":"10a71b21-16cb-4064-b639-fbfc2893812a","Type":"ContainerStarted","Data":"081bb20551ca9cbc8a7846f3b7438dacf8fef516376f83750eed329fbcf3a885"} Sep 30 13:51:50 crc kubenswrapper[4763]: E0930 13:51:50.479154 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" podUID="10a71b21-16cb-4064-b639-fbfc2893812a" Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.483363 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" event={"ID":"5acc6630-bf7d-4acf-b724-60e722171e8f","Type":"ContainerStarted","Data":"60ccf118b886862b1cbaf5184cbf614a8e51e9d3c2ebecb3c156c369c0403b8e"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.483498 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" event={"ID":"5acc6630-bf7d-4acf-b724-60e722171e8f","Type":"ContainerStarted","Data":"40ec4b6d12e5701db9e610e84e4dae016b18b3767a36fda456d835f8f7dc9b4a"} Sep 30 13:51:50 crc kubenswrapper[4763]: E0930 13:51:50.485147 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" podUID="5acc6630-bf7d-4acf-b724-60e722171e8f" Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.486370 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" event={"ID":"ec701ff9-8d7e-43ef-8887-bafe3f09deba","Type":"ContainerStarted","Data":"0d878686f6e97a82889422b94a12d9db0df572d9174c1680f808bf7e0caa2553"} Sep 30 13:51:50 crc kubenswrapper[4763]: E0930 13:51:50.498694 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" podUID="6a1bd649-6042-4f29-b6ab-cb3bcfcdca51" Sep 30 13:51:50 crc kubenswrapper[4763]: E0930 13:51:50.504098 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" podUID="08f0bef7-63c6-4118-a5f3-953efc2e638c" Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.520975 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-fdr5r" event={"ID":"6efd5b9a-3e7d-4913-930d-3fe4452551b6","Type":"ContainerStarted","Data":"424df5da21edc8bc729834f1848775d11f6943f3e4a7bf58e840d6a5dcb57883"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.521016 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" event={"ID":"6a1bd649-6042-4f29-b6ab-cb3bcfcdca51","Type":"ContainerStarted","Data":"3cb7ce42cc0a3571679f3329f01ca451616d52a591ba50994cb1efb212a9a983"} Sep 30 
13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.521039 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" event={"ID":"6a1bd649-6042-4f29-b6ab-cb3bcfcdca51","Type":"ContainerStarted","Data":"8b00039a1817e6728abed03aca1d12680cdad34b2d6037af3aee7848e7ca5fb6"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.521051 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" event={"ID":"08f0bef7-63c6-4118-a5f3-953efc2e638c","Type":"ContainerStarted","Data":"cad6933891da56ec9ef28509dc0194d891a3a862f796c13738986f9d76e99ad7"} Sep 30 13:51:50 crc kubenswrapper[4763]: I0930 13:51:50.521069 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" event={"ID":"08f0bef7-63c6-4118-a5f3-953efc2e638c","Type":"ContainerStarted","Data":"a19ffddf6f6bdba519402c72cc09b2ef8b0b381dfe8f3dd15e263f0ab964bd43"} Sep 30 13:51:51 crc kubenswrapper[4763]: I0930 13:51:51.518501 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" event={"ID":"d8b604b2-49b7-4471-9e02-161a0caebc4b","Type":"ContainerStarted","Data":"4383ae623b6220c2290c5e420f816f1bc0a827eb44443203a8912706bce4a5a5"} Sep 30 13:51:51 crc kubenswrapper[4763]: I0930 13:51:51.518723 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" event={"ID":"d8b604b2-49b7-4471-9e02-161a0caebc4b","Type":"ContainerStarted","Data":"73dcd035c0bf95df2e4eb11387bbdf6c9af39ab392536424e4c63750d8e933e4"} Sep 30 13:51:51 crc kubenswrapper[4763]: I0930 13:51:51.518735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" event={"ID":"d8b604b2-49b7-4471-9e02-161a0caebc4b","Type":"ContainerStarted","Data":"615615b038745c2e8899ad231c24afe5923b700915ddda3a7a08ad7eb602ea84"} Sep 30 13:51:51 crc kubenswrapper[4763]: I0930 13:51:51.519671 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:51:51 crc kubenswrapper[4763]: E0930 13:51:51.520069 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" podUID="10a71b21-16cb-4064-b639-fbfc2893812a" Sep 30 13:51:51 crc kubenswrapper[4763]: E0930 13:51:51.520479 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" podUID="08f0bef7-63c6-4118-a5f3-953efc2e638c" Sep 30 13:51:51 crc kubenswrapper[4763]: E0930 13:51:51.520530 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884\\\"\"" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" podUID="5d3c4b15-3e62-4fe1-ba6c-37100873dc7e" Sep 30 13:51:51 crc kubenswrapper[4763]: E0930 13:51:51.522527 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" podUID="5acc6630-bf7d-4acf-b724-60e722171e8f" Sep 30 13:51:51 crc kubenswrapper[4763]: E0930 13:51:51.522590 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" podUID="9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd" Sep 30 13:51:51 crc kubenswrapper[4763]: E0930 13:51:51.522667 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" podUID="6a1bd649-6042-4f29-b6ab-cb3bcfcdca51" Sep 30 13:51:51 crc kubenswrapper[4763]: E0930 13:51:51.522716 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" podUID="5bf8cc8d-90d0-4687-bfca-fd75f8d1c308" Sep 30 13:51:58 crc kubenswrapper[4763]: I0930 13:51:58.526624 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" podStartSLOduration=11.526584234 podStartE2EDuration="11.526584234s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:51:51.684955201 +0000 UTC m=+983.823515486" watchObservedRunningTime="2025-09-30 13:51:58.526584234 +0000 UTC m=+990.665144519" Sep 30 13:51:59 crc kubenswrapper[4763]: I0930 13:51:59.629047 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-m2cnh" Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.624743 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5" event={"ID":"ea3d1e11-a06c-4cc4-af77-725fdafb57c8","Type":"ContainerStarted","Data":"130aa5c5c9fa0d97aef7148acf48b4b051745df7c7562eef97bd535890ee45ac"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.628810 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8" 
event={"ID":"55edd3bf-c291-4659-a6dc-1c348d04799c","Type":"ContainerStarted","Data":"783d75e26214b4dcc080726c3261fa20fbe99a345998b2180546d3aa3d776de9"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.630356 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j" event={"ID":"02b9b96d-a908-4964-ac52-b5b8f73ffbef","Type":"ContainerStarted","Data":"ee33788ea20443904e6afcab689618992017e35e56ccfa0272dfec840e4c727e"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.647554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6" event={"ID":"3b13bc76-0bcb-48f3-9e18-f04720087325","Type":"ContainerStarted","Data":"2eb683c39620c3875ebcf0f81d6837a873643e4a26d84e23fc15c3333b5030f3"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.652290 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq" event={"ID":"4addb186-b77b-4a86-85fc-87604ccb3c09","Type":"ContainerStarted","Data":"f73cccba2f5c514ffaf3d83c1e87535cf404479dec1df7906c28f6af89f08c49"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.652331 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq" event={"ID":"4addb186-b77b-4a86-85fc-87604ccb3c09","Type":"ContainerStarted","Data":"3055606ab8f3145fb9e4997ccd84e53ac3be2e86f6a379bb9cbf34a837b27126"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.653189 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq" Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.659818 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r" event={"ID":"f38cae68-c345-48f7-9be3-ea9467cb5485","Type":"ContainerStarted","Data":"d97f0bf957f1dd8260f8381c1b0356c48c71c4f133ed06429c7d8409ed5d7c80"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.659861 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r" event={"ID":"f38cae68-c345-48f7-9be3-ea9467cb5485","Type":"ContainerStarted","Data":"efe8e1ffb5225b2de3b2c2434b3aca1d273f85c75131a378dc5fa7f6a2d91407"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.659964 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r" Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.671551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv" event={"ID":"a4cba4a2-dc1b-485e-b141-a4d7f82176ac","Type":"ContainerStarted","Data":"3535587ca9169ccf6fa9b8edeaa7dce2c778ca930d73da01e80e7bf725705182"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.675992 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq" podStartSLOduration=3.599706199 podStartE2EDuration="15.675981181s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.311752117 +0000 UTC m=+981.450312412" lastFinishedPulling="2025-09-30 13:52:01.388027109 +0000 UTC m=+993.526587394" observedRunningTime="2025-09-30 13:52:02.674862233 +0000 UTC 
m=+994.813422538" watchObservedRunningTime="2025-09-30 13:52:02.675981181 +0000 UTC m=+994.814541466" Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.682102 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh" event={"ID":"7dccabec-591d-4737-8977-1aa8b6fd5907","Type":"ContainerStarted","Data":"1a7dacd700962c4f15a9bd80394f7a9d9bf532d651190f21daee6f562b02e608"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.682141 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh" event={"ID":"7dccabec-591d-4737-8977-1aa8b6fd5907","Type":"ContainerStarted","Data":"a65e2380803076ef36e3e3309fc1204702b089c667a4a48bd25ab2a2b3b1b300"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.682814 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh" Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.684467 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl" event={"ID":"14beb357-7d8b-4cbd-bda6-56eddcd765b0","Type":"ContainerStarted","Data":"3ceb3aa00c00f683fd5f94c210a928efdc28534e031c6fbcbd1fe7d4ad7de2b5"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.685783 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-fdr5r" event={"ID":"6efd5b9a-3e7d-4913-930d-3fe4452551b6","Type":"ContainerStarted","Data":"21c66823214ae68e48422c2eeafc0ac9cb05569bb6b1eb0f06d6f595aa60d329"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.694227 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f" event={"ID":"1b8d1e87-64b4-4462-a46d-822489fa80f7","Type":"ContainerStarted","Data":"f8ca82e75bae1d6498ef5d44830bfc1cf7e4200d380d3041659dc2be4bec7627"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.709493 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn" event={"ID":"008d7fd6-b4bf-44bd-b06f-fc3a8787cb66","Type":"ContainerStarted","Data":"e632423cb7690278c0f46f77d69c359e1f57b836c2687805fda4fc20e103e47e"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.709544 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn" event={"ID":"008d7fd6-b4bf-44bd-b06f-fc3a8787cb66","Type":"ContainerStarted","Data":"006318860f19efdadfda4b2cec8f5fb339508ac520eee8134b80b8b7fb204a9f"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.709767 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn" Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.720035 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j" event={"ID":"29c17248-6b6c-4ab7-8204-0f5d34a30da5","Type":"ContainerStarted","Data":"b0740ebe243ffc5685d43fd5f6d57bd26916f104d7e68f1269887fbb57d6b56f"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.721829 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" 
event={"ID":"ec701ff9-8d7e-43ef-8887-bafe3f09deba","Type":"ContainerStarted","Data":"a5a988c88fd8cdf0d5e9cca5bc5a2869bcb9e012fd3d6744b45cea5d70a4c6c5"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.723468 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs" event={"ID":"8622909b-a085-4553-8bbc-9a33d5f6df74","Type":"ContainerStarted","Data":"bdbd183924c602821010d639c04e3b6c78cf7778e1c056a6af3bc869e958b147"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.723491 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs" event={"ID":"8622909b-a085-4553-8bbc-9a33d5f6df74","Type":"ContainerStarted","Data":"f9bcae4457d5d9dee5f5ea5dc8fcab07a9c80ff75cc4f9f6abefc242f920a399"} Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.724125 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs" Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.731031 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r" podStartSLOduration=4.34882384 podStartE2EDuration="16.731014583s" podCreationTimestamp="2025-09-30 13:51:46 +0000 UTC" firstStartedPulling="2025-09-30 13:51:48.886824207 +0000 UTC m=+981.025384492" lastFinishedPulling="2025-09-30 13:52:01.26901495 +0000 UTC m=+993.407575235" observedRunningTime="2025-09-30 13:52:02.696404095 +0000 UTC m=+994.834964380" watchObservedRunningTime="2025-09-30 13:52:02.731014583 +0000 UTC m=+994.869574868" Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.734127 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-fdr5r" podStartSLOduration=3.607129846 podStartE2EDuration="15.734118351s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.332380535 +0000 UTC m=+981.470940820" lastFinishedPulling="2025-09-30 13:52:01.45936904 +0000 UTC m=+993.597929325" observedRunningTime="2025-09-30 13:52:02.728919111 +0000 UTC m=+994.867479396" watchObservedRunningTime="2025-09-30 13:52:02.734118351 +0000 UTC m=+994.872678636" Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.812926 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh" podStartSLOduration=3.8026924170000003 podStartE2EDuration="15.81290414s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.291016476 +0000 UTC m=+981.429576761" lastFinishedPulling="2025-09-30 13:52:01.301228199 +0000 UTC m=+993.439788484" observedRunningTime="2025-09-30 13:52:02.757890608 +0000 UTC m=+994.896450893" watchObservedRunningTime="2025-09-30 13:52:02.81290414 +0000 UTC m=+994.951464425" Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.824233 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn" podStartSLOduration=4.283475298 podStartE2EDuration="16.824216564s" podCreationTimestamp="2025-09-30 13:51:46 +0000 UTC" firstStartedPulling="2025-09-30 13:51:48.848104424 +0000 UTC m=+980.986664709" lastFinishedPulling="2025-09-30 13:52:01.38884569 +0000 UTC m=+993.527405975" observedRunningTime="2025-09-30 
13:52:02.79180683 +0000 UTC m=+994.930367115" watchObservedRunningTime="2025-09-30 13:52:02.824216564 +0000 UTC m=+994.962776849" Sep 30 13:52:02 crc kubenswrapper[4763]: I0930 13:52:02.844690 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs" podStartSLOduration=3.942431465 podStartE2EDuration="15.844666567s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.350678704 +0000 UTC m=+981.489238989" lastFinishedPulling="2025-09-30 13:52:01.252913806 +0000 UTC m=+993.391474091" observedRunningTime="2025-09-30 13:52:02.826516681 +0000 UTC m=+994.965076966" watchObservedRunningTime="2025-09-30 13:52:02.844666567 +0000 UTC m=+994.983226852" Sep 30 13:52:03 crc kubenswrapper[4763]: I0930 13:52:03.733825 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv" event={"ID":"a4cba4a2-dc1b-485e-b141-a4d7f82176ac","Type":"ContainerStarted","Data":"319eddd9b712316a3a8c14181ebfc556401031468de0bf4da6310f40ec3e1e06"} Sep 30 13:52:03 crc kubenswrapper[4763]: I0930 13:52:03.736442 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j" event={"ID":"29c17248-6b6c-4ab7-8204-0f5d34a30da5","Type":"ContainerStarted","Data":"702698e19da9264e5961e9b6c0e2cf947f35e9c0653882d97c6a4a04c8723dc5"} Sep 30 13:52:05 crc kubenswrapper[4763]: I0930 13:52:05.756066 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j" Sep 30 13:52:05 crc kubenswrapper[4763]: I0930 13:52:05.779242 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j" podStartSLOduration=6.640298301 podStartE2EDuration="19.779195517s" podCreationTimestamp="2025-09-30 13:51:46 +0000 UTC" firstStartedPulling="2025-09-30 13:51:48.112154823 +0000 UTC m=+980.250715108" lastFinishedPulling="2025-09-30 13:52:01.251052039 +0000 UTC m=+993.389612324" observedRunningTime="2025-09-30 13:52:05.77290101 +0000 UTC m=+997.911461295" watchObservedRunningTime="2025-09-30 13:52:05.779195517 +0000 UTC m=+997.917755802" Sep 30 13:52:06 crc kubenswrapper[4763]: I0930 13:52:06.777863 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-8s74j" Sep 30 13:52:07 crc kubenswrapper[4763]: I0930 13:52:07.250917 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-rpc2r" Sep 30 13:52:07 crc kubenswrapper[4763]: I0930 13:52:07.318864 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-vkwpn" Sep 30 13:52:07 crc kubenswrapper[4763]: I0930 13:52:07.780483 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-6grkh" Sep 30 13:52:07 crc kubenswrapper[4763]: I0930 13:52:07.839222 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-22fzq" Sep 30 13:52:07 crc kubenswrapper[4763]: I0930 13:52:07.839850 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-qbqvs" Sep 30 13:52:08 crc kubenswrapper[4763]: I0930 13:52:08.789780 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f" event={"ID":"1b8d1e87-64b4-4462-a46d-822489fa80f7","Type":"ContainerStarted","Data":"82d3d610e5ada7461005b67f3c7f2b15aad2dd39e49310bca4d50470384a8c6d"} Sep 30 13:52:08 crc kubenswrapper[4763]: I0930 13:52:08.791667 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5" event={"ID":"ea3d1e11-a06c-4cc4-af77-725fdafb57c8","Type":"ContainerStarted","Data":"f1729923554a95ef7bd890067ba1009be5af21e92a36187f543a2fc841203fec"} Sep 30 13:52:08 crc kubenswrapper[4763]: I0930 13:52:08.793867 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8" event={"ID":"55edd3bf-c291-4659-a6dc-1c348d04799c","Type":"ContainerStarted","Data":"5cc59a21b556f069ea1e32c1bc7f00a81ccc415aead096c97522db8ceaf31ee7"} Sep 30 13:52:08 crc kubenswrapper[4763]: I0930 13:52:08.796357 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j" event={"ID":"02b9b96d-a908-4964-ac52-b5b8f73ffbef","Type":"ContainerStarted","Data":"352bd610b4064b7fec352406db29475b7bfe7eee8bfe0567719e1ddd65ddfd6f"} Sep 30 13:52:08 crc kubenswrapper[4763]: I0930 13:52:08.798569 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" event={"ID":"ec701ff9-8d7e-43ef-8887-bafe3f09deba","Type":"ContainerStarted","Data":"c0ad2c6c83e9464ed7f6de802c3cc68bca3650f6439e3e8dd06d7d0f3ae3f12d"} Sep 30 13:52:08 crc kubenswrapper[4763]: I0930 13:52:08.800712 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl" event={"ID":"14beb357-7d8b-4cbd-bda6-56eddcd765b0","Type":"ContainerStarted","Data":"2967683aa988e0dc9767fb015ed26b4cf16d2d7b53046aad05d40a43849e4f4f"} Sep 30 13:52:08 crc kubenswrapper[4763]: I0930 13:52:08.801006 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv" Sep 30 13:52:08 crc kubenswrapper[4763]: I0930 13:52:08.802986 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv" Sep 30 13:52:08 crc kubenswrapper[4763]: I0930 13:52:08.822559 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-lhnzv" podStartSLOduration=10.177071055 podStartE2EDuration="22.82254104s" podCreationTimestamp="2025-09-30 13:51:46 +0000 UTC" firstStartedPulling="2025-09-30 13:51:48.747166079 +0000 UTC m=+980.885726364" lastFinishedPulling="2025-09-30 13:52:01.392636064 +0000 UTC m=+993.531196349" observedRunningTime="2025-09-30 13:52:08.818512079 +0000 UTC m=+1000.957072364" watchObservedRunningTime="2025-09-30 13:52:08.82254104 +0000 UTC m=+1000.961101325" Sep 30 13:52:09 crc kubenswrapper[4763]: I0930 13:52:09.815256 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6" 
event={"ID":"3b13bc76-0bcb-48f3-9e18-f04720087325","Type":"ContainerStarted","Data":"17f4325eeb9c5cc6f014658c5e40ba2a9d01490bb66cf6d8736c030aa778a770"} Sep 30 13:52:09 crc kubenswrapper[4763]: I0930 13:52:09.816345 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6" Sep 30 13:52:09 crc kubenswrapper[4763]: I0930 13:52:09.816385 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5" Sep 30 13:52:09 crc kubenswrapper[4763]: I0930 13:52:09.818081 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5" Sep 30 13:52:09 crc kubenswrapper[4763]: I0930 13:52:09.818338 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6" Sep 30 13:52:09 crc kubenswrapper[4763]: I0930 13:52:09.857868 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-hkxs6" podStartSLOduration=11.937710516 podStartE2EDuration="23.857842227s" podCreationTimestamp="2025-09-30 13:51:46 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.350039368 +0000 UTC m=+981.488599653" lastFinishedPulling="2025-09-30 13:52:01.270171079 +0000 UTC m=+993.408731364" observedRunningTime="2025-09-30 13:52:09.838280116 +0000 UTC m=+1001.976840421" watchObservedRunningTime="2025-09-30 13:52:09.857842227 +0000 UTC m=+1001.996402512" Sep 30 13:52:09 crc kubenswrapper[4763]: I0930 13:52:09.859194 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-8ftj5" podStartSLOduration=11.334447677 podStartE2EDuration="23.859185851s" podCreationTimestamp="2025-09-30 13:51:46 +0000 UTC" firstStartedPulling="2025-09-30 13:51:48.863275555 +0000 UTC m=+981.001835840" lastFinishedPulling="2025-09-30 13:52:01.388013729 +0000 UTC m=+993.526574014" observedRunningTime="2025-09-30 13:52:09.855290643 +0000 UTC m=+1001.993850958" watchObservedRunningTime="2025-09-30 13:52:09.859185851 +0000 UTC m=+1001.997746136" Sep 30 13:52:09 crc kubenswrapper[4763]: I0930 13:52:09.885868 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f" podStartSLOduration=10.923424747 podStartE2EDuration="22.8858203s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.290654947 +0000 UTC m=+981.429215232" lastFinishedPulling="2025-09-30 13:52:01.2530505 +0000 UTC m=+993.391610785" observedRunningTime="2025-09-30 13:52:09.877262185 +0000 UTC m=+1002.015822490" watchObservedRunningTime="2025-09-30 13:52:09.8858203 +0000 UTC m=+1002.024380585" Sep 30 13:52:09 crc kubenswrapper[4763]: I0930 13:52:09.922536 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl" podStartSLOduration=10.957922165 podStartE2EDuration="22.922509172s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.345156476 +0000 UTC m=+981.483716761" lastFinishedPulling="2025-09-30 13:52:01.309743483 +0000 UTC m=+993.448303768" observedRunningTime="2025-09-30 13:52:09.915170287 +0000 UTC m=+1002.053730562" 
watchObservedRunningTime="2025-09-30 13:52:09.922509172 +0000 UTC m=+1002.061069447" Sep 30 13:52:09 crc kubenswrapper[4763]: I0930 13:52:09.943783 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8" podStartSLOduration=10.901141619 podStartE2EDuration="22.943764136s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.350060488 +0000 UTC m=+981.488620773" lastFinishedPulling="2025-09-30 13:52:01.392683005 +0000 UTC m=+993.531243290" observedRunningTime="2025-09-30 13:52:09.939902008 +0000 UTC m=+1002.078462293" watchObservedRunningTime="2025-09-30 13:52:09.943764136 +0000 UTC m=+1002.082324421" Sep 30 13:52:09 crc kubenswrapper[4763]: I0930 13:52:09.998570 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j" podStartSLOduration=10.804947613 podStartE2EDuration="22.998548151s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.194951774 +0000 UTC m=+981.333512059" lastFinishedPulling="2025-09-30 13:52:01.388552312 +0000 UTC m=+993.527112597" observedRunningTime="2025-09-30 13:52:09.995343441 +0000 UTC m=+1002.133903726" watchObservedRunningTime="2025-09-30 13:52:09.998548151 +0000 UTC m=+1002.137108446" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.847196 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" event={"ID":"9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd","Type":"ContainerStarted","Data":"fa1544d53c11a234f2246e4698052a08d162206b73e6bc05b10a96b1e0abeb95"} Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.847932 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.850268 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" event={"ID":"6a1bd649-6042-4f29-b6ab-cb3bcfcdca51","Type":"ContainerStarted","Data":"a0ede5f2e79bc71a4fe668ce8793dfaf74f13c4dd81aada304b6284d720dd2e9"} Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.850448 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.852141 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" event={"ID":"5bf8cc8d-90d0-4687-bfca-fd75f8d1c308","Type":"ContainerStarted","Data":"1b3be2f8d590eb20e9003b901cc74f44ca8c83452d5a8d8b4c163fa57039dde3"} Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.852347 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.854811 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" event={"ID":"08f0bef7-63c6-4118-a5f3-953efc2e638c","Type":"ContainerStarted","Data":"92b5ccb614bf188f52718897ece2abe9231c924a33ae82047f206c84b9e76fe8"} Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.855114 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.857562 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" event={"ID":"10a71b21-16cb-4064-b639-fbfc2893812a","Type":"ContainerStarted","Data":"1743798b636955a6f30f0deaefd4f7cad79fc4d60e10a5535829a750bc3896e1"} Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.857784 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.859501 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" event={"ID":"5d3c4b15-3e62-4fe1-ba6c-37100873dc7e","Type":"ContainerStarted","Data":"8057fab0b36a05d6b86822fd66ac2b49685f79d911ad3e307df989c161eec139"} Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.859798 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.861952 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" event={"ID":"5acc6630-bf7d-4acf-b724-60e722171e8f","Type":"ContainerStarted","Data":"622970aed5c99b23f4e63ffca9c0ac374808de2576a2b44730320bf0c4f019e5"} Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.862123 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.880676 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" podStartSLOduration=3.152183001 podStartE2EDuration="26.880659486s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.378896533 +0000 UTC m=+981.517456818" lastFinishedPulling="2025-09-30 13:52:13.107373018 +0000 UTC m=+1005.245933303" observedRunningTime="2025-09-30 13:52:13.871264241 +0000 UTC m=+1006.009824526" watchObservedRunningTime="2025-09-30 13:52:13.880659486 +0000 UTC m=+1006.019219771" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.881968 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" podStartSLOduration=16.186636533 podStartE2EDuration="27.881962589s" podCreationTimestamp="2025-09-30 13:51:46 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.693415791 +0000 UTC m=+981.831976066" lastFinishedPulling="2025-09-30 13:52:01.388741837 +0000 UTC m=+993.527302122" observedRunningTime="2025-09-30 13:52:10.022176425 +0000 UTC m=+1002.160736720" watchObservedRunningTime="2025-09-30 13:52:13.881962589 +0000 UTC m=+1006.020522884" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.913733 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" podStartSLOduration=3.107251602 podStartE2EDuration="26.913710926s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.37919922 +0000 UTC m=+981.517759505" lastFinishedPulling="2025-09-30 
13:52:13.185658544 +0000 UTC m=+1005.324218829" observedRunningTime="2025-09-30 13:52:13.908322991 +0000 UTC m=+1006.046883266" watchObservedRunningTime="2025-09-30 13:52:13.913710926 +0000 UTC m=+1006.052271211" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.933921 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" podStartSLOduration=3.182968374 podStartE2EDuration="26.933905163s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.356559602 +0000 UTC m=+981.495119887" lastFinishedPulling="2025-09-30 13:52:13.107496391 +0000 UTC m=+1005.246056676" observedRunningTime="2025-09-30 13:52:13.932734814 +0000 UTC m=+1006.071295099" watchObservedRunningTime="2025-09-30 13:52:13.933905163 +0000 UTC m=+1006.072465448" Sep 30 13:52:13 crc kubenswrapper[4763]: I0930 13:52:13.991536 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" podStartSLOduration=3.263189868 podStartE2EDuration="26.991520981s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.379520348 +0000 UTC m=+981.518080643" lastFinishedPulling="2025-09-30 13:52:13.107851471 +0000 UTC m=+1005.246411756" observedRunningTime="2025-09-30 13:52:13.98909875 +0000 UTC m=+1006.127659035" watchObservedRunningTime="2025-09-30 13:52:13.991520981 +0000 UTC m=+1006.130081266" Sep 30 13:52:14 crc kubenswrapper[4763]: I0930 13:52:14.036418 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" podStartSLOduration=3.308424505 podStartE2EDuration="27.036399858s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.379390395 +0000 UTC m=+981.517950680" lastFinishedPulling="2025-09-30 13:52:13.107365748 +0000 UTC m=+1005.245926033" observedRunningTime="2025-09-30 13:52:14.01739841 +0000 UTC m=+1006.155958695" watchObservedRunningTime="2025-09-30 13:52:14.036399858 +0000 UTC m=+1006.174960143" Sep 30 13:52:14 crc kubenswrapper[4763]: I0930 13:52:14.059379 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" podStartSLOduration=3.277044578 podStartE2EDuration="27.059360034s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.354746697 +0000 UTC m=+981.493307092" lastFinishedPulling="2025-09-30 13:52:13.137062263 +0000 UTC m=+1005.275622548" observedRunningTime="2025-09-30 13:52:14.039780142 +0000 UTC m=+1006.178340427" watchObservedRunningTime="2025-09-30 13:52:14.059360034 +0000 UTC m=+1006.197920319" Sep 30 13:52:14 crc kubenswrapper[4763]: I0930 13:52:14.060373 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" podStartSLOduration=3.32417979 podStartE2EDuration="27.060366569s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:49.371662151 +0000 UTC m=+981.510222436" lastFinishedPulling="2025-09-30 13:52:13.10784893 +0000 UTC m=+1005.246409215" observedRunningTime="2025-09-30 13:52:14.057580169 +0000 UTC m=+1006.196140444" watchObservedRunningTime="2025-09-30 13:52:14.060366569 +0000 UTC m=+1006.198926854" Sep 30 13:52:17 crc 
kubenswrapper[4763]: I0930 13:52:17.674541 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f" Sep 30 13:52:17 crc kubenswrapper[4763]: I0930 13:52:17.677450 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-2sn8f" Sep 30 13:52:18 crc kubenswrapper[4763]: I0930 13:52:18.041939 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl" Sep 30 13:52:18 crc kubenswrapper[4763]: I0930 13:52:18.045652 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-p95zl" Sep 30 13:52:18 crc kubenswrapper[4763]: I0930 13:52:18.066200 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j" Sep 30 13:52:18 crc kubenswrapper[4763]: I0930 13:52:18.069049 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-v8f7j" Sep 30 13:52:18 crc kubenswrapper[4763]: I0930 13:52:18.087103 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8" Sep 30 13:52:18 crc kubenswrapper[4763]: I0930 13:52:18.092640 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-r5sn8" Sep 30 13:52:18 crc kubenswrapper[4763]: I0930 13:52:18.100299 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-72d45" Sep 30 13:52:18 crc kubenswrapper[4763]: I0930 13:52:18.561963 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds" Sep 30 13:52:19 crc kubenswrapper[4763]: I0930 13:52:19.252450 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:52:19 crc kubenswrapper[4763]: I0930 13:52:19.259582 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g7cnm" Sep 30 13:52:27 crc kubenswrapper[4763]: I0930 13:52:27.678553 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-594nq" Sep 30 13:52:27 crc kubenswrapper[4763]: I0930 13:52:27.679183 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-6mp69" Sep 30 13:52:27 crc kubenswrapper[4763]: I0930 13:52:27.802144 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-vtdfz" Sep 30 13:52:27 crc kubenswrapper[4763]: I0930 13:52:27.827205 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-jkggm" Sep 30 13:52:27 crc kubenswrapper[4763]: I0930 13:52:27.875783 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-wnh7g" Sep 30 13:52:41 crc kubenswrapper[4763]: I0930 13:52:41.909084 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-bsrnl"] Sep 30 13:52:41 crc kubenswrapper[4763]: I0930 13:52:41.910899 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:52:41 crc kubenswrapper[4763]: I0930 13:52:41.913539 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 30 13:52:41 crc kubenswrapper[4763]: I0930 13:52:41.913898 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 30 13:52:41 crc kubenswrapper[4763]: I0930 13:52:41.913898 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 30 13:52:41 crc kubenswrapper[4763]: I0930 13:52:41.914228 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 30 13:52:41 crc kubenswrapper[4763]: I0930 13:52:41.914286 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-r7jkl" Sep 30 13:52:41 crc kubenswrapper[4763]: I0930 13:52:41.924223 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-bsrnl"] Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.038327 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-dns-svc\") pod \"dnsmasq-dns-d5f6f49c7-bsrnl\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.038382 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-config\") pod \"dnsmasq-dns-d5f6f49c7-bsrnl\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.038404 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plsrt\" (UniqueName: \"kubernetes.io/projected/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-kube-api-access-plsrt\") pod \"dnsmasq-dns-d5f6f49c7-bsrnl\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.139734 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-dns-svc\") pod \"dnsmasq-dns-d5f6f49c7-bsrnl\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.140089 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-config\") pod \"dnsmasq-dns-d5f6f49c7-bsrnl\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.140116 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plsrt\" (UniqueName: 
\"kubernetes.io/projected/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-kube-api-access-plsrt\") pod \"dnsmasq-dns-d5f6f49c7-bsrnl\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.141562 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-config\") pod \"dnsmasq-dns-d5f6f49c7-bsrnl\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.142150 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-dns-svc\") pod \"dnsmasq-dns-d5f6f49c7-bsrnl\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.159317 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plsrt\" (UniqueName: \"kubernetes.io/projected/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-kube-api-access-plsrt\") pod \"dnsmasq-dns-d5f6f49c7-bsrnl\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.237806 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.647392 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-bsrnl"] Sep 30 13:52:42 crc kubenswrapper[4763]: I0930 13:52:42.654918 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:52:43 crc kubenswrapper[4763]: I0930 13:52:43.068034 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" event={"ID":"5a941342-6ab0-49b5-9a11-2cbd7d3367d8","Type":"ContainerStarted","Data":"6acebec9c0bbe8c3c5b4bda39e7b53eb98bc0541024dcf97a90f303fffb035aa"} Sep 30 13:52:44 crc kubenswrapper[4763]: I0930 13:52:44.856757 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-b4p76"] Sep 30 13:52:44 crc kubenswrapper[4763]: I0930 13:52:44.858997 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:52:44 crc kubenswrapper[4763]: I0930 13:52:44.866370 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-b4p76"] Sep 30 13:52:44 crc kubenswrapper[4763]: I0930 13:52:44.989960 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-config\") pod \"dnsmasq-dns-b6f94bdfc-b4p76\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:52:44 crc kubenswrapper[4763]: I0930 13:52:44.990053 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w29bs\" (UniqueName: \"kubernetes.io/projected/d5cacb83-c744-4798-afff-0736b0938677-kube-api-access-w29bs\") pod \"dnsmasq-dns-b6f94bdfc-b4p76\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:52:44 crc kubenswrapper[4763]: I0930 13:52:44.990104 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-dns-svc\") pod \"dnsmasq-dns-b6f94bdfc-b4p76\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.094322 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-config\") pod \"dnsmasq-dns-b6f94bdfc-b4p76\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.094392 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w29bs\" (UniqueName: \"kubernetes.io/projected/d5cacb83-c744-4798-afff-0736b0938677-kube-api-access-w29bs\") pod \"dnsmasq-dns-b6f94bdfc-b4p76\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.094412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-dns-svc\") pod \"dnsmasq-dns-b6f94bdfc-b4p76\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.095337 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-dns-svc\") pod \"dnsmasq-dns-b6f94bdfc-b4p76\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.095551 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-config\") pod \"dnsmasq-dns-b6f94bdfc-b4p76\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.131292 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w29bs\" (UniqueName: 
\"kubernetes.io/projected/d5cacb83-c744-4798-afff-0736b0938677-kube-api-access-w29bs\") pod \"dnsmasq-dns-b6f94bdfc-b4p76\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.147861 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-bsrnl"] Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.177820 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-mlfnr"] Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.178828 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.179271 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.187216 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-mlfnr"] Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.297908 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-dns-svc\") pod \"dnsmasq-dns-77795d58f5-mlfnr\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.298001 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-config\") pod \"dnsmasq-dns-77795d58f5-mlfnr\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.298030 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnxt2\" (UniqueName: \"kubernetes.io/projected/899b170a-9f2f-4275-afcc-a78446c89728-kube-api-access-pnxt2\") pod \"dnsmasq-dns-77795d58f5-mlfnr\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.400210 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-config\") pod \"dnsmasq-dns-77795d58f5-mlfnr\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.400262 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnxt2\" (UniqueName: \"kubernetes.io/projected/899b170a-9f2f-4275-afcc-a78446c89728-kube-api-access-pnxt2\") pod \"dnsmasq-dns-77795d58f5-mlfnr\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.400340 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-dns-svc\") pod \"dnsmasq-dns-77795d58f5-mlfnr\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.402001 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-config\") pod \"dnsmasq-dns-77795d58f5-mlfnr\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.402484 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-dns-svc\") pod \"dnsmasq-dns-77795d58f5-mlfnr\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.429564 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnxt2\" (UniqueName: \"kubernetes.io/projected/899b170a-9f2f-4275-afcc-a78446c89728-kube-api-access-pnxt2\") pod \"dnsmasq-dns-77795d58f5-mlfnr\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.493842 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.658541 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-b4p76"] Sep 30 13:52:45 crc kubenswrapper[4763]: W0930 13:52:45.666484 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5cacb83_c744_4798_afff_0736b0938677.slice/crio-e8a33c2037b714d97bcc5c7407386b540defb37e4f0cab2ed73326f203b4fff2 WatchSource:0}: Error finding container e8a33c2037b714d97bcc5c7407386b540defb37e4f0cab2ed73326f203b4fff2: Status 404 returned error can't find the container with id e8a33c2037b714d97bcc5c7407386b540defb37e4f0cab2ed73326f203b4fff2 Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.912623 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-mlfnr"] Sep 30 13:52:45 crc kubenswrapper[4763]: W0930 13:52:45.916727 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod899b170a_9f2f_4275_afcc_a78446c89728.slice/crio-d100aa5a9c39898d37d500e316b65b8fe2af3bbfc2c6ff0c46be290964292176 WatchSource:0}: Error finding container d100aa5a9c39898d37d500e316b65b8fe2af3bbfc2c6ff0c46be290964292176: Status 404 returned error can't find the container with id d100aa5a9c39898d37d500e316b65b8fe2af3bbfc2c6ff0c46be290964292176 Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.998058 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 13:52:45 crc kubenswrapper[4763]: I0930 13:52:45.999226 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.002248 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.002344 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.002383 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.002383 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.002383 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.003180 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.003778 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5gdpz" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.012655 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.088822 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" event={"ID":"899b170a-9f2f-4275-afcc-a78446c89728","Type":"ContainerStarted","Data":"d100aa5a9c39898d37d500e316b65b8fe2af3bbfc2c6ff0c46be290964292176"} Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.089960 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" event={"ID":"d5cacb83-c744-4798-afff-0736b0938677","Type":"ContainerStarted","Data":"e8a33c2037b714d97bcc5c7407386b540defb37e4f0cab2ed73326f203b4fff2"} Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.110023 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.110173 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.110302 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.110326 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 
30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.110357 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.110415 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.110482 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3119638a-6580-4a24-8e7f-40f7f7d788a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.110508 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.110548 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.110565 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3119638a-6580-4a24-8e7f-40f7f7d788a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.110645 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5kwk\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-kube-api-access-f5kwk\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.211710 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.211798 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.211820 4763 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.211844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.211865 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.211904 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3119638a-6580-4a24-8e7f-40f7f7d788a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.211930 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.211952 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.211968 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3119638a-6580-4a24-8e7f-40f7f7d788a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.211995 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5kwk\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-kube-api-access-f5kwk\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.212034 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.212209 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc 
kubenswrapper[4763]: I0930 13:52:46.212571 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.212614 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.212963 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.213354 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.213372 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.218432 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3119638a-6580-4a24-8e7f-40f7f7d788a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.218438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3119638a-6580-4a24-8e7f-40f7f7d788a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.218560 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.219148 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.233415 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " 
pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.234740 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5kwk\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-kube-api-access-f5kwk\") pod \"rabbitmq-server-0\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") " pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.307196 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.308882 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.312844 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.312919 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.312990 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.313012 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.313085 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.313114 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.313349 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pb6rp" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.313384 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.327795 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.413876 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.414186 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.414222 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.414242 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzs9q\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-kube-api-access-pzs9q\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.414265 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.414292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.414316 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.414346 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.414367 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.414396 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.414454 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.515610 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.515694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.515734 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.515755 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.515772 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.515791 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzs9q\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-kube-api-access-pzs9q\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.515814 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 
13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.515842 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.515874 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.515911 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.515946 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.516274 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.516371 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.516453 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.516831 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.517443 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.517511 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.519309 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.519847 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.520775 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.521926 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.536025 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzs9q\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-kube-api-access-pzs9q\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.536376 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.637584 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:52:46 crc kubenswrapper[4763]: I0930 13:52:46.754401 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 13:52:46 crc kubenswrapper[4763]: W0930 13:52:46.758500 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3119638a_6580_4a24_8e7f_40f7f7d788a5.slice/crio-0f74623c31148386b7f2ed927d954b6a1888fb4461a3a1afe66adce24f49d293 WatchSource:0}: Error finding container 0f74623c31148386b7f2ed927d954b6a1888fb4461a3a1afe66adce24f49d293: Status 404 returned error can't find the container with id 0f74623c31148386b7f2ed927d954b6a1888fb4461a3a1afe66adce24f49d293 Sep 30 13:52:47 crc kubenswrapper[4763]: I0930 13:52:47.043688 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 13:52:47 crc kubenswrapper[4763]: W0930 13:52:47.050278 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaebd5213_18eb_4d84_b39e_fd22f9ff9a6c.slice/crio-39be8740061f3f51aef9766adac9cb7379938198ef064858f5d80903a2095993 WatchSource:0}: Error finding container 39be8740061f3f51aef9766adac9cb7379938198ef064858f5d80903a2095993: Status 404 returned error can't find the container with id 39be8740061f3f51aef9766adac9cb7379938198ef064858f5d80903a2095993 Sep 30 13:52:47 crc kubenswrapper[4763]: I0930 13:52:47.096789 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c","Type":"ContainerStarted","Data":"39be8740061f3f51aef9766adac9cb7379938198ef064858f5d80903a2095993"} Sep 30 13:52:47 crc kubenswrapper[4763]: I0930 13:52:47.097959 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3119638a-6580-4a24-8e7f-40f7f7d788a5","Type":"ContainerStarted","Data":"0f74623c31148386b7f2ed927d954b6a1888fb4461a3a1afe66adce24f49d293"} Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.719442 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.734998 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.739389 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.739739 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.740571 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-gtlc2" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.744799 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.746888 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.746904 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.747858 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.761836 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.763260 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.766760 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-2bt25" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.767389 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.767582 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.768423 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.774002 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859352 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859400 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859423 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-kolla-config\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " 
pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859440 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859456 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859482 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859500 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-secrets\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859518 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859533 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859554 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859574 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859591 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e2c5264e-b119-4444-b954-c33b428294b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " 
pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859634 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859669 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j6rf\" (UniqueName: \"kubernetes.io/projected/e2c5264e-b119-4444-b954-c33b428294b5-kube-api-access-6j6rf\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859702 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859740 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28mlb\" (UniqueName: \"kubernetes.io/projected/2b7af94e-accb-45ca-af30-c489c8d77b12-kube-api-access-28mlb\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.859761 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.961586 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-secrets\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.961667 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.961699 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: 
I0930 13:52:48.961740 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.961778 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.961809 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e2c5264e-b119-4444-b954-c33b428294b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.961851 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.961886 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j6rf\" (UniqueName: \"kubernetes.io/projected/e2c5264e-b119-4444-b954-c33b428294b5-kube-api-access-6j6rf\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.961932 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.961964 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.962000 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28mlb\" (UniqueName: \"kubernetes.io/projected/2b7af94e-accb-45ca-af30-c489c8d77b12-kube-api-access-28mlb\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.962035 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.962053 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded 
for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.962086 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.962123 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.962156 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-kolla-config\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.962185 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.962209 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.962248 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.963206 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.964442 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-kolla-config\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.964751 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.965384 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e2c5264e-b119-4444-b954-c33b428294b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.965476 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.968459 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.969119 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.969417 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.969671 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.970553 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.980147 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.980903 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.982058 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.983202 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:48 crc kubenswrapper[4763]: I0930 13:52:48.994007 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-secrets\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.003888 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j6rf\" (UniqueName: \"kubernetes.io/projected/e2c5264e-b119-4444-b954-c33b428294b5-kube-api-access-6j6rf\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.005760 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28mlb\" (UniqueName: \"kubernetes.io/projected/2b7af94e-accb-45ca-af30-c489c8d77b12-kube-api-access-28mlb\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.008515 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " pod="openstack/openstack-galera-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.015782 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.072123 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.088586 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.287293 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.290945 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.293186 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8f826" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.293505 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.299795 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.300134 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.377250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kolla-config\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.377344 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.377392 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft9rn\" (UniqueName: \"kubernetes.io/projected/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kube-api-access-ft9rn\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.377452 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.377673 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-config-data\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.479624 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-config-data\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.479771 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kolla-config\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.479825 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.479861 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft9rn\" (UniqueName: \"kubernetes.io/projected/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kube-api-access-ft9rn\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.479902 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.480475 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-config-data\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.480742 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kolla-config\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.486121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.488094 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.500644 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft9rn\" (UniqueName: \"kubernetes.io/projected/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kube-api-access-ft9rn\") pod \"memcached-0\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " pod="openstack/memcached-0" Sep 30 13:52:49 crc kubenswrapper[4763]: I0930 13:52:49.644639 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 13:52:50 crc kubenswrapper[4763]: I0930 13:52:50.842931 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:52:50 crc kubenswrapper[4763]: I0930 13:52:50.845149 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:52:50 crc kubenswrapper[4763]: I0930 13:52:50.850441 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8bhbp" Sep 30 13:52:50 crc kubenswrapper[4763]: I0930 13:52:50.851837 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:52:50 crc kubenswrapper[4763]: I0930 13:52:50.938556 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k9ck\" (UniqueName: \"kubernetes.io/projected/4412eaea-f645-451a-8b88-c562357c6b1e-kube-api-access-6k9ck\") pod \"kube-state-metrics-0\" (UID: \"4412eaea-f645-451a-8b88-c562357c6b1e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:52:51 crc kubenswrapper[4763]: I0930 13:52:51.039576 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k9ck\" (UniqueName: \"kubernetes.io/projected/4412eaea-f645-451a-8b88-c562357c6b1e-kube-api-access-6k9ck\") pod \"kube-state-metrics-0\" (UID: \"4412eaea-f645-451a-8b88-c562357c6b1e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:52:51 crc kubenswrapper[4763]: I0930 13:52:51.063255 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k9ck\" (UniqueName: \"kubernetes.io/projected/4412eaea-f645-451a-8b88-c562357c6b1e-kube-api-access-6k9ck\") pod \"kube-state-metrics-0\" (UID: \"4412eaea-f645-451a-8b88-c562357c6b1e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:52:51 crc kubenswrapper[4763]: I0930 13:52:51.166666 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.611968 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kwz5v"] Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.613731 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.615853 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-85jdj" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.617289 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.617542 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.625920 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kwz5v"] Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.634848 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-72z5c"] Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.636485 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.662812 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-72z5c"] Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751381 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08cae05d-3853-4e7a-a66c-380c023d086b-scripts\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751524 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-etc-ovs\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751563 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run-ovn\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751612 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-run\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751631 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z6x5\" (UniqueName: \"kubernetes.io/projected/08cae05d-3853-4e7a-a66c-380c023d086b-kube-api-access-4z6x5\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751652 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-combined-ca-bundle\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751672 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-log\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751690 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1db73295-0655-443c-91e0-2cd08b119141-scripts\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751711 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751738 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-lib\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751758 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-ovn-controller-tls-certs\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751797 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-log-ovn\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.751828 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs7n8\" (UniqueName: \"kubernetes.io/projected/1db73295-0655-443c-91e0-2cd08b119141-kube-api-access-rs7n8\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853350 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-log-ovn\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853411 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs7n8\" (UniqueName: \"kubernetes.io/projected/1db73295-0655-443c-91e0-2cd08b119141-kube-api-access-rs7n8\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853433 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08cae05d-3853-4e7a-a66c-380c023d086b-scripts\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853458 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-etc-ovs\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853488 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run-ovn\") pod \"ovn-controller-kwz5v\" (UID: 
\"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853519 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-run\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853554 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z6x5\" (UniqueName: \"kubernetes.io/projected/08cae05d-3853-4e7a-a66c-380c023d086b-kube-api-access-4z6x5\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853581 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-combined-ca-bundle\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853621 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-log\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853643 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1db73295-0655-443c-91e0-2cd08b119141-scripts\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853661 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853685 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-lib\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.853703 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-ovn-controller-tls-certs\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.854250 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-log-ovn\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.854290 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.854862 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-log\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.855022 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-etc-ovs\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.856006 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08cae05d-3853-4e7a-a66c-380c023d086b-scripts\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.856212 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-lib\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.856434 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1db73295-0655-443c-91e0-2cd08b119141-scripts\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.856713 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run-ovn\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.856985 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-run\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.861204 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-combined-ca-bundle\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.862418 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-ovn-controller-tls-certs\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.872814 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z6x5\" (UniqueName: \"kubernetes.io/projected/08cae05d-3853-4e7a-a66c-380c023d086b-kube-api-access-4z6x5\") pod \"ovn-controller-ovs-72z5c\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.874164 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs7n8\" (UniqueName: \"kubernetes.io/projected/1db73295-0655-443c-91e0-2cd08b119141-kube-api-access-rs7n8\") pod \"ovn-controller-kwz5v\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.938310 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kwz5v" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.965390 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:52:54 crc kubenswrapper[4763]: I0930 13:52:54.998665 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.002457 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.004842 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.005144 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-qxgrz" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.005776 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.005979 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.006192 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.010460 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.158626 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.158669 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.158764 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmd79\" (UniqueName: \"kubernetes.io/projected/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-kube-api-access-kmd79\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 
13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.158793 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-config\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.158810 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.158827 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.158851 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.159014 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.260214 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmd79\" (UniqueName: \"kubernetes.io/projected/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-kube-api-access-kmd79\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.260273 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-config\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.260342 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.260371 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.260398 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.260450 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.260480 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.260499 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.260819 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.261032 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.261895 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-config\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.261921 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.264260 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.267926 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.268944 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.278781 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmd79\" (UniqueName: \"kubernetes.io/projected/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-kube-api-access-kmd79\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.284915 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:55 crc kubenswrapper[4763]: I0930 13:52:55.340699 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 13:52:57 crc kubenswrapper[4763]: I0930 13:52:57.907540 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 13:52:57 crc kubenswrapper[4763]: I0930 13:52:57.910297 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:57 crc kubenswrapper[4763]: I0930 13:52:57.912805 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 13:52:57 crc kubenswrapper[4763]: I0930 13:52:57.913241 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 13:52:57 crc kubenswrapper[4763]: I0930 13:52:57.913983 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-sc2jh" Sep 30 13:52:57 crc kubenswrapper[4763]: I0930 13:52:57.915677 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 13:52:57 crc kubenswrapper[4763]: I0930 13:52:57.918635 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.013066 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9tjf\" (UniqueName: \"kubernetes.io/projected/b611e133-1d4a-49a8-9632-bdb825d41fa4-kube-api-access-k9tjf\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.013125 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.013175 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.013315 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.013394 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-config\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.013461 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.013649 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.013957 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.115592 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.115675 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9tjf\" (UniqueName: \"kubernetes.io/projected/b611e133-1d4a-49a8-9632-bdb825d41fa4-kube-api-access-k9tjf\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.115714 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.115759 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.115781 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.115801 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-config\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.115831 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.115901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.116166 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.118998 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.119019 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-config\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.119433 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.123753 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.125335 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.134121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k9tjf\" (UniqueName: \"kubernetes.io/projected/b611e133-1d4a-49a8-9632-bdb825d41fa4-kube-api-access-k9tjf\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.134677 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.138360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:52:58 crc kubenswrapper[4763]: I0930 13:52:58.232478 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 13:53:00 crc kubenswrapper[4763]: I0930 13:53:00.521072 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 13:53:00 crc kubenswrapper[4763]: W0930 13:53:00.921230 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2c5264e_b119_4444_b954_c33b428294b5.slice/crio-026035cf8cc1caa067dae1549ea6b96a1ec29587d04378d9dc2ddf53c3015c3a WatchSource:0}: Error finding container 026035cf8cc1caa067dae1549ea6b96a1ec29587d04378d9dc2ddf53c3015c3a: Status 404 returned error can't find the container with id 026035cf8cc1caa067dae1549ea6b96a1ec29587d04378d9dc2ddf53c3015c3a Sep 30 13:53:00 crc kubenswrapper[4763]: E0930 13:53:00.940066 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b" Sep 30 13:53:00 crc kubenswrapper[4763]: E0930 13:53:00.940491 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b" Sep 30 13:53:00 crc kubenswrapper[4763]: E0930 13:53:00.941094 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-plsrt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-d5f6f49c7-bsrnl_openstack(5a941342-6ab0-49b5-9a11-2cbd7d3367d8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:00 crc kubenswrapper[4763]: E0930 13:53:00.942727 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" podUID="5a941342-6ab0-49b5-9a11-2cbd7d3367d8" Sep 30 13:53:00 crc kubenswrapper[4763]: E0930 13:53:00.943269 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w29bs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-b6f94bdfc-b4p76_openstack(d5cacb83-c744-4798-afff-0736b0938677): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 13:53:00 crc kubenswrapper[4763]: E0930 13:53:00.945371 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" podUID="d5cacb83-c744-4798-afff-0736b0938677"
Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.238189 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2c5264e-b119-4444-b954-c33b428294b5","Type":"ContainerStarted","Data":"026035cf8cc1caa067dae1549ea6b96a1ec29587d04378d9dc2ddf53c3015c3a"}
Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.411869 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Sep 30 13:53:01 crc kubenswrapper[4763]: W0930 13:53:01.440997 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b7af94e_accb_45ca_af30_c489c8d77b12.slice/crio-3fa32050d777708047fe02f0901ecc0ed4c915235281a6ed52411d21b1bdd265 WatchSource:0}: Error finding container 3fa32050d777708047fe02f0901ecc0ed4c915235281a6ed52411d21b1bdd265: Status 404 returned error can't find the container with id 3fa32050d777708047fe02f0901ecc0ed4c915235281a6ed52411d21b1bdd265
Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.491854 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kwz5v"]
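NOTE (editor): the "PullImage from image service failed" and "Unhandled Error"/"Error syncing pod" entries above for both dnsmasq-dns pods report gRPC code Canceled, not a registry fault: the kubelet canceled its own pull contexts (most plausibly because both pods were already being superseded, as the DELETE entries below confirm), and CRI-O surfaced the canceled context as "copying config: context canceled". A minimal Go sketch of how a context cancellation comes back out of a CRI-style call as codes.Canceled; the pullImage helper is hypothetical, not kubelet or CRI-O source:

	package main

	import (
		"context"
		"fmt"

		"google.golang.org/grpc/codes"
		"google.golang.org/grpc/status"
	)

	// pullImage stands in for a CRI ImageService PullImage call; a real
	// runtime checks the context between copied chunks and aborts early,
	// wrapping the cause the way the log entries above show it.
	func pullImage(ctx context.Context, image string) error {
		<-ctx.Done()
		return status.Error(codes.Canceled, "copying config: "+ctx.Err().Error())
	}

	func main() {
		ctx, cancel := context.WithCancel(context.Background())
		cancel() // the kubelet cancels the pull when the pod is torn down
		err := pullImage(ctx, "quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b")
		fmt.Println(status.Code(err) == codes.Canceled, err)
		// true rpc error: code = Canceled desc = copying config: context canceled
	}

Sep 30 13:53:01 crc kubenswrapper[4763]: W0930 13:53:01.494752 4763 manager.go:1169] Failed to process watch event {EventType:0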
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1db73295_0655_443c_91e0_2cd08b119141.slice/crio-9a3167b2cef40c711d40ef58feda0d96b9fafe2cc7ae9998185de46719f43773 WatchSource:0}: Error finding container 9a3167b2cef40c711d40ef58feda0d96b9fafe2cc7ae9998185de46719f43773: Status 404 returned error can't find the container with id 9a3167b2cef40c711d40ef58feda0d96b9fafe2cc7ae9998185de46719f43773 Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.518555 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-72z5c"] Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.575124 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.582304 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 13:53:01 crc kubenswrapper[4763]: W0930 13:53:01.592854 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4412eaea_f645_451a_8b88_c562357c6b1e.slice/crio-a76224d57b415530c39bb22b5ef2b9e213b179a97d3acc51904ce4f30b9cd44b WatchSource:0}: Error finding container a76224d57b415530c39bb22b5ef2b9e213b179a97d3acc51904ce4f30b9cd44b: Status 404 returned error can't find the container with id a76224d57b415530c39bb22b5ef2b9e213b179a97d3acc51904ce4f30b9cd44b Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.668812 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 13:53:01 crc kubenswrapper[4763]: W0930 13:53:01.689707 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3cc8ad7_1903_4a9f_94a4_a84f47cd1189.slice/crio-9a25e9cdb67390ac3b4b5e800b4a3484640138398471ec6560b2922d885c8434 WatchSource:0}: Error finding container 9a25e9cdb67390ac3b4b5e800b4a3484640138398471ec6560b2922d885c8434: Status 404 returned error can't find the container with id 9a25e9cdb67390ac3b4b5e800b4a3484640138398471ec6560b2922d885c8434 Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.693359 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.769323 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 13:53:01 crc kubenswrapper[4763]: W0930 13:53:01.774259 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb611e133_1d4a_49a8_9632_bdb825d41fa4.slice/crio-683e72551225fb42a0e8bddfa2b7dd515124d23141de7400e8111c20c09509bf WatchSource:0}: Error finding container 683e72551225fb42a0e8bddfa2b7dd515124d23141de7400e8111c20c09509bf: Status 404 returned error can't find the container with id 683e72551225fb42a0e8bddfa2b7dd515124d23141de7400e8111c20c09509bf Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.789377 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plsrt\" (UniqueName: \"kubernetes.io/projected/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-kube-api-access-plsrt\") pod \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.789458 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-config\") pod \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.789559 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-dns-svc\") pod \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\" (UID: \"5a941342-6ab0-49b5-9a11-2cbd7d3367d8\") " Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.789870 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-config" (OuterVolumeSpecName: "config") pod "5a941342-6ab0-49b5-9a11-2cbd7d3367d8" (UID: "5a941342-6ab0-49b5-9a11-2cbd7d3367d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.790052 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.790193 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5a941342-6ab0-49b5-9a11-2cbd7d3367d8" (UID: "5a941342-6ab0-49b5-9a11-2cbd7d3367d8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.794414 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-kube-api-access-plsrt" (OuterVolumeSpecName: "kube-api-access-plsrt") pod "5a941342-6ab0-49b5-9a11-2cbd7d3367d8" (UID: "5a941342-6ab0-49b5-9a11-2cbd7d3367d8"). InnerVolumeSpecName "kube-api-access-plsrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.891328 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plsrt\" (UniqueName: \"kubernetes.io/projected/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-kube-api-access-plsrt\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:01 crc kubenswrapper[4763]: I0930 13:53:01.891377 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a941342-6ab0-49b5-9a11-2cbd7d3367d8-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.248312 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3119638a-6580-4a24-8e7f-40f7f7d788a5","Type":"ContainerStarted","Data":"f4cd7078c0ddc04e2c8e6651fa5ad9a35e37bd449097a1bdf9256ab5a071cf30"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.253651 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b7af94e-accb-45ca-af30-c489c8d77b12","Type":"ContainerStarted","Data":"3fa32050d777708047fe02f0901ecc0ed4c915235281a6ed52411d21b1bdd265"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.255648 4763 generic.go:334] "Generic (PLEG): container finished" podID="899b170a-9f2f-4275-afcc-a78446c89728" containerID="3a20e4d4dfb70d194d2b1ce810de4ebd7a7633126a43b1296fb41c26ee901fae" exitCode=0 Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.255715 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" event={"ID":"899b170a-9f2f-4275-afcc-a78446c89728","Type":"ContainerDied","Data":"3a20e4d4dfb70d194d2b1ce810de4ebd7a7633126a43b1296fb41c26ee901fae"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.257303 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-72z5c" event={"ID":"08cae05d-3853-4e7a-a66c-380c023d086b","Type":"ContainerStarted","Data":"a70cd39f6185e0831b184e524834c5fcd081e8ec637941345560d79748162292"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.258729 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.258729 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5f6f49c7-bsrnl" event={"ID":"5a941342-6ab0-49b5-9a11-2cbd7d3367d8","Type":"ContainerDied","Data":"6acebec9c0bbe8c3c5b4bda39e7b53eb98bc0541024dcf97a90f303fffb035aa"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.264924 4763 generic.go:334] "Generic (PLEG): container finished" podID="d5cacb83-c744-4798-afff-0736b0938677" containerID="1657059944b9a24b3f1593dddd3a5847faabcfcfc089c3e8643cd03210c5153f" exitCode=0 Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.265021 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" event={"ID":"d5cacb83-c744-4798-afff-0736b0938677","Type":"ContainerDied","Data":"1657059944b9a24b3f1593dddd3a5847faabcfcfc089c3e8643cd03210c5153f"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.275819 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189","Type":"ContainerStarted","Data":"9a25e9cdb67390ac3b4b5e800b4a3484640138398471ec6560b2922d885c8434"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.277326 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kwz5v" event={"ID":"1db73295-0655-443c-91e0-2cd08b119141","Type":"ContainerStarted","Data":"9a3167b2cef40c711d40ef58feda0d96b9fafe2cc7ae9998185de46719f43773"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.278809 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4412eaea-f645-451a-8b88-c562357c6b1e","Type":"ContainerStarted","Data":"a76224d57b415530c39bb22b5ef2b9e213b179a97d3acc51904ce4f30b9cd44b"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.279718 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e5f7940e-dedf-45a0-97b4-dc825dc00fc5","Type":"ContainerStarted","Data":"d5bdae3e0963d01b6c4265fdebe04dcdd44b7df16dbb249020ed0328adfb9388"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.280793 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b611e133-1d4a-49a8-9632-bdb825d41fa4","Type":"ContainerStarted","Data":"683e72551225fb42a0e8bddfa2b7dd515124d23141de7400e8111c20c09509bf"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.282001 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c","Type":"ContainerStarted","Data":"ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707"} Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.368855 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-bsrnl"] Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.381536 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-bsrnl"] Sep 30 13:53:02 crc kubenswrapper[4763]: I0930 13:53:02.503218 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a941342-6ab0-49b5-9a11-2cbd7d3367d8" path="/var/lib/kubelet/pods/5a941342-6ab0-49b5-9a11-2cbd7d3367d8/volumes" Sep 30 13:53:04 crc kubenswrapper[4763]: I0930 13:53:04.305563 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" 
event={"ID":"d5cacb83-c744-4798-afff-0736b0938677","Type":"ContainerStarted","Data":"2200b27a0c98cdd54e40e964cff5f84258aa1887b71943ac0505a2e73a8d2ec5"} Sep 30 13:53:04 crc kubenswrapper[4763]: I0930 13:53:04.306147 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:53:04 crc kubenswrapper[4763]: I0930 13:53:04.353628 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" podStartSLOduration=-9223372016.501186 podStartE2EDuration="20.353590024s" podCreationTimestamp="2025-09-30 13:52:44 +0000 UTC" firstStartedPulling="2025-09-30 13:52:45.669381819 +0000 UTC m=+1037.807942104" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:53:04.351128782 +0000 UTC m=+1056.489689077" watchObservedRunningTime="2025-09-30 13:53:04.353590024 +0000 UTC m=+1056.492150319" Sep 30 13:53:06 crc kubenswrapper[4763]: I0930 13:53:06.060199 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:53:06 crc kubenswrapper[4763]: I0930 13:53:06.060269 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:53:07 crc kubenswrapper[4763]: I0930 13:53:07.331006 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" event={"ID":"899b170a-9f2f-4275-afcc-a78446c89728","Type":"ContainerStarted","Data":"7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f"} Sep 30 13:53:07 crc kubenswrapper[4763]: I0930 13:53:07.331320 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:53:07 crc kubenswrapper[4763]: I0930 13:53:07.349985 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" podStartSLOduration=7.194476974 podStartE2EDuration="22.349968777s" podCreationTimestamp="2025-09-30 13:52:45 +0000 UTC" firstStartedPulling="2025-09-30 13:52:45.918928787 +0000 UTC m=+1038.057489072" lastFinishedPulling="2025-09-30 13:53:01.07442059 +0000 UTC m=+1053.212980875" observedRunningTime="2025-09-30 13:53:07.347870514 +0000 UTC m=+1059.486430829" watchObservedRunningTime="2025-09-30 13:53:07.349968777 +0000 UTC m=+1059.488529062" Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.350483 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2c5264e-b119-4444-b954-c33b428294b5","Type":"ContainerStarted","Data":"f6e4f42f53a2bc5ce14b714651684ec859fa6c8501a49e587b1165761bb42457"} Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.352664 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189","Type":"ContainerStarted","Data":"780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593"} Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.362885 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-kwz5v" event={"ID":"1db73295-0655-443c-91e0-2cd08b119141","Type":"ContainerStarted","Data":"6929f8af8d3cf797dc4b407e18a7a6d4c22dc654105d94f4fa1d84446a16b519"} Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.363144 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-kwz5v" Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.365698 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b7af94e-accb-45ca-af30-c489c8d77b12","Type":"ContainerStarted","Data":"19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399"} Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.368760 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4412eaea-f645-451a-8b88-c562357c6b1e","Type":"ContainerStarted","Data":"0d0112a1787094253153b3a60f8663407e6cb545baa23acb0b8b21ec9335b321"} Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.369249 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.382316 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-72z5c" event={"ID":"08cae05d-3853-4e7a-a66c-380c023d086b","Type":"ContainerDied","Data":"62bc8ec1bc27fbde74a1f9c030003027564bbbd612c1161e410a2129b5bc1b90"} Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.382744 4763 generic.go:334] "Generic (PLEG): container finished" podID="08cae05d-3853-4e7a-a66c-380c023d086b" containerID="62bc8ec1bc27fbde74a1f9c030003027564bbbd612c1161e410a2129b5bc1b90" exitCode=0 Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.386451 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e5f7940e-dedf-45a0-97b4-dc825dc00fc5","Type":"ContainerStarted","Data":"8d9e3ab86dc859f16e88025097f97a98ba29d69374fe2446837e45205f560afd"} Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.386612 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.389521 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b611e133-1d4a-49a8-9632-bdb825d41fa4","Type":"ContainerStarted","Data":"f076032ba256059553984a2d073b2dcc74aadf98fe54ecddda41aaee3f716c6e"} Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.404638 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.665816661000001 podStartE2EDuration="19.404581201s" podCreationTimestamp="2025-09-30 13:52:50 +0000 UTC" firstStartedPulling="2025-09-30 13:53:01.603855044 +0000 UTC m=+1053.742415329" lastFinishedPulling="2025-09-30 13:53:08.342619574 +0000 UTC m=+1060.481179869" observedRunningTime="2025-09-30 13:53:09.39216184 +0000 UTC m=+1061.530722135" watchObservedRunningTime="2025-09-30 13:53:09.404581201 +0000 UTC m=+1061.543141486" Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.419626 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-kwz5v" podStartSLOduration=8.6852656 podStartE2EDuration="15.419577268s" podCreationTimestamp="2025-09-30 13:52:54 +0000 UTC" firstStartedPulling="2025-09-30 13:53:01.500791596 +0000 UTC m=+1053.639351881" lastFinishedPulling="2025-09-30 13:53:08.235103264 +0000 UTC m=+1060.373663549" 
observedRunningTime="2025-09-30 13:53:09.412426728 +0000 UTC m=+1061.550987023" watchObservedRunningTime="2025-09-30 13:53:09.419577268 +0000 UTC m=+1061.558137543" Sep 30 13:53:09 crc kubenswrapper[4763]: I0930 13:53:09.460111 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.21729093 podStartE2EDuration="20.458346692s" podCreationTimestamp="2025-09-30 13:52:49 +0000 UTC" firstStartedPulling="2025-09-30 13:53:01.60407332 +0000 UTC m=+1053.742633615" lastFinishedPulling="2025-09-30 13:53:07.845129092 +0000 UTC m=+1059.983689377" observedRunningTime="2025-09-30 13:53:09.457181242 +0000 UTC m=+1061.595741527" watchObservedRunningTime="2025-09-30 13:53:09.458346692 +0000 UTC m=+1061.596906977" Sep 30 13:53:10 crc kubenswrapper[4763]: I0930 13:53:10.182929 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:53:10 crc kubenswrapper[4763]: I0930 13:53:10.399577 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-72z5c" event={"ID":"08cae05d-3853-4e7a-a66c-380c023d086b","Type":"ContainerStarted","Data":"276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264"} Sep 30 13:53:14 crc kubenswrapper[4763]: I0930 13:53:14.645897 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 13:53:15 crc kubenswrapper[4763]: I0930 13:53:15.496503 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:53:15 crc kubenswrapper[4763]: I0930 13:53:15.549559 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-b4p76"] Sep 30 13:53:15 crc kubenswrapper[4763]: I0930 13:53:15.549829 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" podUID="d5cacb83-c744-4798-afff-0736b0938677" containerName="dnsmasq-dns" containerID="cri-o://2200b27a0c98cdd54e40e964cff5f84258aa1887b71943ac0505a2e73a8d2ec5" gracePeriod=10 Sep 30 13:53:16 crc kubenswrapper[4763]: I0930 13:53:16.444409 4763 generic.go:334] "Generic (PLEG): container finished" podID="d5cacb83-c744-4798-afff-0736b0938677" containerID="2200b27a0c98cdd54e40e964cff5f84258aa1887b71943ac0505a2e73a8d2ec5" exitCode=0 Sep 30 13:53:16 crc kubenswrapper[4763]: I0930 13:53:16.444474 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" event={"ID":"d5cacb83-c744-4798-afff-0736b0938677","Type":"ContainerDied","Data":"2200b27a0c98cdd54e40e964cff5f84258aa1887b71943ac0505a2e73a8d2ec5"} Sep 30 13:53:16 crc kubenswrapper[4763]: I0930 13:53:16.446670 4763 generic.go:334] "Generic (PLEG): container finished" podID="e2c5264e-b119-4444-b954-c33b428294b5" containerID="f6e4f42f53a2bc5ce14b714651684ec859fa6c8501a49e587b1165761bb42457" exitCode=0 Sep 30 13:53:16 crc kubenswrapper[4763]: I0930 13:53:16.446765 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2c5264e-b119-4444-b954-c33b428294b5","Type":"ContainerDied","Data":"f6e4f42f53a2bc5ce14b714651684ec859fa6c8501a49e587b1165761bb42457"} Sep 30 13:53:16 crc kubenswrapper[4763]: I0930 13:53:16.449173 4763 generic.go:334] "Generic (PLEG): container finished" podID="2b7af94e-accb-45ca-af30-c489c8d77b12" containerID="19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399" exitCode=0 Sep 30 13:53:16 crc 
kubenswrapper[4763]: I0930 13:53:16.449247 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b7af94e-accb-45ca-af30-c489c8d77b12","Type":"ContainerDied","Data":"19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399"} Sep 30 13:53:17 crc kubenswrapper[4763]: I0930 13:53:17.804702 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:53:17 crc kubenswrapper[4763]: I0930 13:53:17.867501 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-config\") pod \"d5cacb83-c744-4798-afff-0736b0938677\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " Sep 30 13:53:17 crc kubenswrapper[4763]: I0930 13:53:17.867558 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w29bs\" (UniqueName: \"kubernetes.io/projected/d5cacb83-c744-4798-afff-0736b0938677-kube-api-access-w29bs\") pod \"d5cacb83-c744-4798-afff-0736b0938677\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " Sep 30 13:53:17 crc kubenswrapper[4763]: I0930 13:53:17.867629 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-dns-svc\") pod \"d5cacb83-c744-4798-afff-0736b0938677\" (UID: \"d5cacb83-c744-4798-afff-0736b0938677\") " Sep 30 13:53:17 crc kubenswrapper[4763]: I0930 13:53:17.872743 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5cacb83-c744-4798-afff-0736b0938677-kube-api-access-w29bs" (OuterVolumeSpecName: "kube-api-access-w29bs") pod "d5cacb83-c744-4798-afff-0736b0938677" (UID: "d5cacb83-c744-4798-afff-0736b0938677"). InnerVolumeSpecName "kube-api-access-w29bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:17 crc kubenswrapper[4763]: I0930 13:53:17.986788 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w29bs\" (UniqueName: \"kubernetes.io/projected/d5cacb83-c744-4798-afff-0736b0938677-kube-api-access-w29bs\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:17 crc kubenswrapper[4763]: I0930 13:53:17.988883 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-config" (OuterVolumeSpecName: "config") pod "d5cacb83-c744-4798-afff-0736b0938677" (UID: "d5cacb83-c744-4798-afff-0736b0938677"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:17.990237 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5cacb83-c744-4798-afff-0736b0938677" (UID: "d5cacb83-c744-4798-afff-0736b0938677"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:17.994957 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-djfwj"] Sep 30 13:53:18 crc kubenswrapper[4763]: E0930 13:53:17.995629 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5cacb83-c744-4798-afff-0736b0938677" containerName="init" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:17.995644 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5cacb83-c744-4798-afff-0736b0938677" containerName="init" Sep 30 13:53:18 crc kubenswrapper[4763]: E0930 13:53:17.995665 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5cacb83-c744-4798-afff-0736b0938677" containerName="dnsmasq-dns" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:17.995671 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5cacb83-c744-4798-afff-0736b0938677" containerName="dnsmasq-dns" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:17.995827 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5cacb83-c744-4798-afff-0736b0938677" containerName="dnsmasq-dns" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:17.996936 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.000476 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.002709 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-djfwj"] Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.088145 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovn-rundir\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.088253 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03f3de76-2dd7-4d26-8010-72d5ff408190-config\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.088301 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-575xr\" (UniqueName: \"kubernetes.io/projected/03f3de76-2dd7-4d26-8010-72d5ff408190-kube-api-access-575xr\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.088329 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-combined-ca-bundle\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.088365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.088390 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovs-rundir\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.088490 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.088505 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5cacb83-c744-4798-afff-0736b0938677-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.134193 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86b869995c-94hsg"] Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.137439 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.139088 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.149814 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b869995c-94hsg"] Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190013 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03f3de76-2dd7-4d26-8010-72d5ff408190-config\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190068 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-dns-svc\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190125 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-575xr\" (UniqueName: \"kubernetes.io/projected/03f3de76-2dd7-4d26-8010-72d5ff408190-kube-api-access-575xr\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190155 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-combined-ca-bundle\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190179 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w9bp\" 
(UniqueName: \"kubernetes.io/projected/3d5d15da-c476-4e5b-91fe-41e005899b64-kube-api-access-4w9bp\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190208 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190230 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovs-rundir\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190263 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-config\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190326 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-ovsdbserver-nb\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190414 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovn-rundir\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190734 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovs-rundir\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190743 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovn-rundir\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.190977 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03f3de76-2dd7-4d26-8010-72d5ff408190-config\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.196324 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.197337 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-combined-ca-bundle\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.208754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-575xr\" (UniqueName: \"kubernetes.io/projected/03f3de76-2dd7-4d26-8010-72d5ff408190-kube-api-access-575xr\") pod \"ovn-controller-metrics-djfwj\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.275265 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b869995c-94hsg"] Sep 30 13:53:18 crc kubenswrapper[4763]: E0930 13:53:18.275903 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-4w9bp ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-86b869995c-94hsg" podUID="3d5d15da-c476-4e5b-91fe-41e005899b64" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.291339 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w9bp\" (UniqueName: \"kubernetes.io/projected/3d5d15da-c476-4e5b-91fe-41e005899b64-kube-api-access-4w9bp\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.291402 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-config\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.291439 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-ovsdbserver-nb\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.291548 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-dns-svc\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.292449 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-ovsdbserver-nb\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.292449 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-config\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.292557 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-dns-svc\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.295331 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-jkmbd"] Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.296570 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.299892 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.311455 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w9bp\" (UniqueName: \"kubernetes.io/projected/3d5d15da-c476-4e5b-91fe-41e005899b64-kube-api-access-4w9bp\") pod \"dnsmasq-dns-86b869995c-94hsg\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.316027 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-jkmbd"] Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.326028 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.392530 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-sb\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.392657 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-config\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.392676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-nb\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.392695 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-dns-svc\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.392748 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxncz\" (UniqueName: \"kubernetes.io/projected/9c80e6d4-9dd4-48d9-8014-65cdf079d153-kube-api-access-lxncz\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.469724 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b7af94e-accb-45ca-af30-c489c8d77b12","Type":"ContainerStarted","Data":"5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97"} Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.474334 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-72z5c" event={"ID":"08cae05d-3853-4e7a-a66c-380c023d086b","Type":"ContainerStarted","Data":"0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145"} Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.474456 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.474473 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.476122 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b611e133-1d4a-49a8-9632-bdb825d41fa4","Type":"ContainerStarted","Data":"ecb99a945c2d9b0bf36ec0c4004dd06419870d85ada8f6a288f0b829f450fa4d"} Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.478725 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" 
event={"ID":"d5cacb83-c744-4798-afff-0736b0938677","Type":"ContainerDied","Data":"e8a33c2037b714d97bcc5c7407386b540defb37e4f0cab2ed73326f203b4fff2"} Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.478756 4763 scope.go:117] "RemoveContainer" containerID="2200b27a0c98cdd54e40e964cff5f84258aa1887b71943ac0505a2e73a8d2ec5" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.478774 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-b4p76" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.481247 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2c5264e-b119-4444-b954-c33b428294b5","Type":"ContainerStarted","Data":"a4f61c64a8df3d9915add4b261e934b36d4aae625742c1aa68894904c7c207d4"} Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.483045 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189","Type":"ContainerStarted","Data":"823878f5a22f30c2add397afa8ccc5fd623a8d47193a01552f23a33548ef8021"} Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.483054 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.495995 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-sb\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.496099 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-config\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.496123 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-nb\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.496143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-dns-svc\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.496197 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxncz\" (UniqueName: \"kubernetes.io/projected/9c80e6d4-9dd4-48d9-8014-65cdf079d153-kube-api-access-lxncz\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.497175 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-sb\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: 
\"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.499150 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-config\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.500052 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-dns-svc\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.500090 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b869995c-94hsg" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.503041 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-nb\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.510683 4763 scope.go:117] "RemoveContainer" containerID="1657059944b9a24b3f1593dddd3a5847faabcfcfc089c3e8643cd03210c5153f" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.517977 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.600794989 podStartE2EDuration="31.51796101s" podCreationTimestamp="2025-09-30 13:52:47 +0000 UTC" firstStartedPulling="2025-09-30 13:53:00.927956621 +0000 UTC m=+1053.066516906" lastFinishedPulling="2025-09-30 13:53:07.845122642 +0000 UTC m=+1059.983682927" observedRunningTime="2025-09-30 13:53:18.512672738 +0000 UTC m=+1070.651233023" watchObservedRunningTime="2025-09-30 13:53:18.51796101 +0000 UTC m=+1070.656521295" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.524517 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxncz\" (UniqueName: \"kubernetes.io/projected/9c80e6d4-9dd4-48d9-8014-65cdf079d153-kube-api-access-lxncz\") pod \"dnsmasq-dns-5d86d68bf7-jkmbd\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.541964 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=24.751692429 podStartE2EDuration="31.541943922s" podCreationTimestamp="2025-09-30 13:52:47 +0000 UTC" firstStartedPulling="2025-09-30 13:53:01.444642476 +0000 UTC m=+1053.583202761" lastFinishedPulling="2025-09-30 13:53:08.234893969 +0000 UTC m=+1060.373454254" observedRunningTime="2025-09-30 13:53:18.495709481 +0000 UTC m=+1070.634269756" watchObservedRunningTime="2025-09-30 13:53:18.541943922 +0000 UTC m=+1070.680504207" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.573939 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.507474164 podStartE2EDuration="25.573914355s" podCreationTimestamp="2025-09-30 13:52:53 +0000 UTC" firstStartedPulling="2025-09-30 13:53:01.692187622 +0000 UTC 
m=+1053.830747907" lastFinishedPulling="2025-09-30 13:53:17.758627813 +0000 UTC m=+1069.897188098" observedRunningTime="2025-09-30 13:53:18.551170724 +0000 UTC m=+1070.689731009" watchObservedRunningTime="2025-09-30 13:53:18.573914355 +0000 UTC m=+1070.712474640" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.591874 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-72z5c" podStartSLOduration=17.893181763 podStartE2EDuration="24.591852916s" podCreationTimestamp="2025-09-30 13:52:54 +0000 UTC" firstStartedPulling="2025-09-30 13:53:01.533966299 +0000 UTC m=+1053.672526574" lastFinishedPulling="2025-09-30 13:53:08.232637442 +0000 UTC m=+1060.371197727" observedRunningTime="2025-09-30 13:53:18.58443065 +0000 UTC m=+1070.722990955" watchObservedRunningTime="2025-09-30 13:53:18.591852916 +0000 UTC m=+1070.730413201" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.597845 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-dns-svc\") pod \"3d5d15da-c476-4e5b-91fe-41e005899b64\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.597929 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-config\") pod \"3d5d15da-c476-4e5b-91fe-41e005899b64\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.598087 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w9bp\" (UniqueName: \"kubernetes.io/projected/3d5d15da-c476-4e5b-91fe-41e005899b64-kube-api-access-4w9bp\") pod \"3d5d15da-c476-4e5b-91fe-41e005899b64\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.598154 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-ovsdbserver-nb\") pod \"3d5d15da-c476-4e5b-91fe-41e005899b64\" (UID: \"3d5d15da-c476-4e5b-91fe-41e005899b64\") " Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.598383 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d5d15da-c476-4e5b-91fe-41e005899b64" (UID: "3d5d15da-c476-4e5b-91fe-41e005899b64"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.598652 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d5d15da-c476-4e5b-91fe-41e005899b64" (UID: "3d5d15da-c476-4e5b-91fe-41e005899b64"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.599815 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-config" (OuterVolumeSpecName: "config") pod "3d5d15da-c476-4e5b-91fe-41e005899b64" (UID: "3d5d15da-c476-4e5b-91fe-41e005899b64"). InnerVolumeSpecName "config". 
Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.601799 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.601900 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.601990 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d5d15da-c476-4e5b-91fe-41e005899b64-config\") on node \"crc\" DevicePath \"\""
Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.613197 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5d15da-c476-4e5b-91fe-41e005899b64-kube-api-access-4w9bp" (OuterVolumeSpecName: "kube-api-access-4w9bp") pod "3d5d15da-c476-4e5b-91fe-41e005899b64" (UID: "3d5d15da-c476-4e5b-91fe-41e005899b64"). InnerVolumeSpecName "kube-api-access-4w9bp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.617513 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd"
Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.638942 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.628804642 podStartE2EDuration="22.638918368s" podCreationTimestamp="2025-09-30 13:52:56 +0000 UTC" firstStartedPulling="2025-09-30 13:53:01.776179922 +0000 UTC m=+1053.914740207" lastFinishedPulling="2025-09-30 13:53:17.786293648 +0000 UTC m=+1069.924853933" observedRunningTime="2025-09-30 13:53:18.602388791 +0000 UTC m=+1070.740949076" watchObservedRunningTime="2025-09-30 13:53:18.638918368 +0000 UTC m=+1070.777478653"
Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.663139 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-b4p76"]
Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.669972 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-b4p76"]
Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.703888 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w9bp\" (UniqueName: \"kubernetes.io/projected/3d5d15da-c476-4e5b-91fe-41e005899b64-kube-api-access-4w9bp\") on node \"crc\" DevicePath \"\""
Sep 30 13:53:18 crc kubenswrapper[4763]: I0930 13:53:18.824486 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-djfwj"]
Sep 30 13:53:18 crc kubenswrapper[4763]: W0930 13:53:18.826333 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03f3de76_2dd7_4d26_8010_72d5ff408190.slice/crio-fa8785e9a555620e38e2aa467798a883b6056a1798c9541327a357d6dcbc638e WatchSource:0}: Error finding container fa8785e9a555620e38e2aa467798a883b6056a1798c9541327a357d6dcbc638e: Status 404 returned error can't find the container with id fa8785e9a555620e38e2aa467798a883b6056a1798c9541327a357d6dcbc638e
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.066187 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-jkmbd"]
Sep 30 13:53:19 crc kubenswrapper[4763]: W0930 13:53:19.071960 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c80e6d4_9dd4_48d9_8014_65cdf079d153.slice/crio-370c843597c9449d591e96cea65b36176a4ec72122ea781bd4035a550689b5b2 WatchSource:0}: Error finding container 370c843597c9449d591e96cea65b36176a4ec72122ea781bd4035a550689b5b2: Status 404 returned error can't find the container with id 370c843597c9449d591e96cea65b36176a4ec72122ea781bd4035a550689b5b2
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.072263 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.072382 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.088712 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.088757 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.234081 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.276682 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.341967 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.388836 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.492425 4763 generic.go:334] "Generic (PLEG): container finished" podID="9c80e6d4-9dd4-48d9-8014-65cdf079d153" containerID="e44593d95d86cfe92fb66b4d93cb6b75d7030c8d48a8a19895b6ac4a42f37a63" exitCode=0
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.492497 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" event={"ID":"9c80e6d4-9dd4-48d9-8014-65cdf079d153","Type":"ContainerDied","Data":"e44593d95d86cfe92fb66b4d93cb6b75d7030c8d48a8a19895b6ac4a42f37a63"}
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.492520 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" event={"ID":"9c80e6d4-9dd4-48d9-8014-65cdf079d153","Type":"ContainerStarted","Data":"370c843597c9449d591e96cea65b36176a4ec72122ea781bd4035a550689b5b2"}
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.497328 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-djfwj" event={"ID":"03f3de76-2dd7-4d26-8010-72d5ff408190","Type":"ContainerStarted","Data":"8a278d3c4ace4c4d3804e473924d1b56cd571d2b8cdd048d77caed140d79f478"}
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.497376 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-djfwj" event={"ID":"03f3de76-2dd7-4d26-8010-72d5ff408190","Type":"ContainerStarted","Data":"fa8785e9a555620e38e2aa467798a883b6056a1798c9541327a357d6dcbc638e"}
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.497426 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b869995c-94hsg"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.499126 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.499184 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.542461 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-djfwj" podStartSLOduration=2.542442207 podStartE2EDuration="2.542442207s" podCreationTimestamp="2025-09-30 13:53:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:53:19.541575325 +0000 UTC m=+1071.680135620" watchObservedRunningTime="2025-09-30 13:53:19.542442207 +0000 UTC m=+1071.681002492"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.571258 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.578221 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.689178 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b869995c-94hsg"]
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.697529 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86b869995c-94hsg"]
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.904295 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.905623 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.909348 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.909531 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-wjj8w"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.909714 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.914007 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Sep 30 13:53:19 crc kubenswrapper[4763]: I0930 13:53:19.922416 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.034517 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.034619 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-scripts\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.034648 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.034696 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.034754 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.034781 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7qll\" (UniqueName: \"kubernetes.io/projected/916727b2-6488-4edf-b33b-c5908eae0e41-kube-api-access-s7qll\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.034815 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-config\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.136726 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-scripts\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.136773 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.136828 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.136863 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.136908 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7qll\" (UniqueName: \"kubernetes.io/projected/916727b2-6488-4edf-b33b-c5908eae0e41-kube-api-access-s7qll\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.136949 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-config\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.136980 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.137716 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.138287 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-scripts\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.138377 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-config\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.141015 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.141576 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.152760 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.156722 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7qll\" (UniqueName: \"kubernetes.io/projected/916727b2-6488-4edf-b33b-c5908eae0e41-kube-api-access-s7qll\") pod \"ovn-northd-0\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.240926 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.499008 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5d15da-c476-4e5b-91fe-41e005899b64" path="/var/lib/kubelet/pods/3d5d15da-c476-4e5b-91fe-41e005899b64/volumes"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.499677 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5cacb83-c744-4798-afff-0736b0938677" path="/var/lib/kubelet/pods/d5cacb83-c744-4798-afff-0736b0938677/volumes"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.504616 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" event={"ID":"9c80e6d4-9dd4-48d9-8014-65cdf079d153","Type":"ContainerStarted","Data":"bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c"}
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.522800 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" podStartSLOduration=2.522778524 podStartE2EDuration="2.522778524s" podCreationTimestamp="2025-09-30 13:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:53:20.521199515 +0000 UTC m=+1072.659759800" watchObservedRunningTime="2025-09-30 13:53:20.522778524 +0000 UTC m=+1072.661338809"
Sep 30 13:53:20 crc kubenswrapper[4763]: I0930 13:53:20.697730 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 13:53:20 crc kubenswrapper[4763]: W0930 13:53:20.713719 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod916727b2_6488_4edf_b33b_c5908eae0e41.slice/crio-ff96488141df2f82b229455031c651a019c33c792c9a917bfaa2b10e83bc49b3 WatchSource:0}: Error finding container ff96488141df2f82b229455031c651a019c33c792c9a917bfaa2b10e83bc49b3: Status 404 returned error can't find the container with id ff96488141df2f82b229455031c651a019c33c792c9a917bfaa2b10e83bc49b3
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.195029 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.297356 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-jkmbd"]
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.336630 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"]
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.337843 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.365425 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"]
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.460094 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-config\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.460496 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.460649 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.460777 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l5hx\" (UniqueName: \"kubernetes.io/projected/9a57f6d1-c09b-410a-af3c-8b3a010da11a-kube-api-access-5l5hx\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.460938 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-dns-svc\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.520034 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"916727b2-6488-4edf-b33b-c5908eae0e41","Type":"ContainerStarted","Data":"ff96488141df2f82b229455031c651a019c33c792c9a917bfaa2b10e83bc49b3"}
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.520074 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.562518 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-dns-svc\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.563316 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-config\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.563416 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-dns-svc\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.563861 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.563905 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.563935 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l5hx\" (UniqueName: \"kubernetes.io/projected/9a57f6d1-c09b-410a-af3c-8b3a010da11a-kube-api-access-5l5hx\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.564236 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-config\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.564474 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.564548 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.584420 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l5hx\" (UniqueName: \"kubernetes.io/projected/9a57f6d1-c09b-410a-af3c-8b3a010da11a-kube-api-access-5l5hx\") pod \"dnsmasq-dns-6c6d5d5bd7-slwbw\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:21 crc kubenswrapper[4763]: I0930 13:53:21.665038 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.139467 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"]
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.472773 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.479434 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.482642 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.482871 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-gzb25"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.483037 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.483755 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.504981 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.537493 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"916727b2-6488-4edf-b33b-c5908eae0e41","Type":"ContainerStarted","Data":"777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b"}
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.537542 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"916727b2-6488-4edf-b33b-c5908eae0e41","Type":"ContainerStarted","Data":"e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737"}
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.537583 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.540116 4763 generic.go:334] "Generic (PLEG): container finished" podID="9a57f6d1-c09b-410a-af3c-8b3a010da11a" containerID="f7ddd2b0fb1d49dfc38face49b6897b54965d13f08f04b6ddbd116bb6356c07b" exitCode=0
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.540280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw" event={"ID":"9a57f6d1-c09b-410a-af3c-8b3a010da11a","Type":"ContainerDied","Data":"f7ddd2b0fb1d49dfc38face49b6897b54965d13f08f04b6ddbd116bb6356c07b"}
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.540532 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw" event={"ID":"9a57f6d1-c09b-410a-af3c-8b3a010da11a","Type":"ContainerStarted","Data":"25ad75ad9fe38cf50b72d806c4759f006f98ae948d7e2d0baf0de5169a082bb8"}
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.540864 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" podUID="9c80e6d4-9dd4-48d9-8014-65cdf079d153" containerName="dnsmasq-dns" containerID="cri-o://bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c" gracePeriod=10
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.566658 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.19640201 podStartE2EDuration="3.566641689s" podCreationTimestamp="2025-09-30 13:53:19 +0000 UTC" firstStartedPulling="2025-09-30 13:53:20.714902028 +0000 UTC m=+1072.853462313" lastFinishedPulling="2025-09-30 13:53:22.085141707 +0000 UTC m=+1074.223701992" observedRunningTime="2025-09-30 13:53:22.559992691 +0000 UTC m=+1074.698552986" watchObservedRunningTime="2025-09-30 13:53:22.566641689 +0000 UTC m=+1074.705201964"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.583114 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.583210 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-cache\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.583302 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.583377 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-lock\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.583487 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rn4x\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-kube-api-access-6rn4x\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.685080 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rn4x\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-kube-api-access-6rn4x\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.685463 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.685583 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-cache\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.685792 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.685939 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-lock\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.686303 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-cache\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.686497 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-lock\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.686592 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/swift-storage-0"
Sep 30 13:53:22 crc kubenswrapper[4763]: E0930 13:53:22.686658 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Sep 30 13:53:22 crc kubenswrapper[4763]: E0930 13:53:22.686672 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Sep 30 13:53:22 crc kubenswrapper[4763]: E0930 13:53:22.686919 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift podName:ba72f8d4-1822-4bb5-a099-c15d4b00b701 nodeName:}" failed. No retries permitted until 2025-09-30 13:53:23.186897059 +0000 UTC m=+1075.325457344 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift") pod "swift-storage-0" (UID: "ba72f8d4-1822-4bb5-a099-c15d4b00b701") : configmap "swift-ring-files" not found
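
Note on the mount failure above: this is a startup-ordering race rather than a persistent fault. The projected etc-swift volume of swift-storage-0 references the swift-ring-files ConfigMap, which does not exist until the swift-ring-rebalance job created in the following entries has generated the rings. The kubelet fails MountVolume.SetUp and schedules a retry, and durationBeforeRetry doubles on consecutive failures of the same volume operation: 500ms here, then 1s when the mount fails again below. A minimal Go sketch of that doubling pattern (the cap is an assumed illustrative value; this is not kubelet's actual nestedpendingoperations code):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const (
            initialBackoff = 500 * time.Millisecond // durationBeforeRetry on the first failure
            maxBackoff     = 2 * time.Minute        // assumed cap, for illustration only
        )
        backoff := initialBackoff
        for attempt := 1; attempt <= 5; attempt++ {
            // Mirrors the log's "No retries permitted until ... (durationBeforeRetry ...)".
            fmt.Printf("attempt %d failed; no retries permitted for %v\n", attempt, backoff)
            backoff *= 2
            if backoff > maxBackoff {
                backoff = maxBackoff
            }
        }
    }
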
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift") pod "swift-storage-0" (UID: "ba72f8d4-1822-4bb5-a099-c15d4b00b701") : configmap "swift-ring-files" not found Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.704996 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rn4x\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-kube-api-access-6rn4x\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0" Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.713504 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0" Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.989165 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s2d6m"] Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.990909 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.994715 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.994952 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Sep 30 13:53:22 crc kubenswrapper[4763]: I0930 13:53:22.995242 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.022279 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.037560 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-s2d6m"] Sep 30 13:53:23 crc kubenswrapper[4763]: E0930 13:53:23.038406 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-mxzzl ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-mxzzl ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-s2d6m" podUID="d3794805-848d-403f-8cac-a2a6b76782c6" Sep 30 13:53:23 crc kubenswrapper[4763]: E0930 13:53:23.083942 4763 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.145:59894->38.129.56.145:42797: write tcp 38.129.56.145:59894->38.129.56.145:42797: write: broken pipe Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.084772 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rjt5b"] Sep 30 13:53:23 crc kubenswrapper[4763]: E0930 13:53:23.085219 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c80e6d4-9dd4-48d9-8014-65cdf079d153" containerName="dnsmasq-dns" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.085235 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c80e6d4-9dd4-48d9-8014-65cdf079d153" containerName="dnsmasq-dns" Sep 30 13:53:23 crc kubenswrapper[4763]: E0930 13:53:23.085252 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c80e6d4-9dd4-48d9-8014-65cdf079d153" containerName="init" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.085268 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c80e6d4-9dd4-48d9-8014-65cdf079d153" containerName="init" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.085450 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c80e6d4-9dd4-48d9-8014-65cdf079d153" containerName="dnsmasq-dns" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.087277 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.098175 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-s2d6m"] Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.098697 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-config\") pod \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.098783 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-nb\") pod \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.098830 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxncz\" (UniqueName: \"kubernetes.io/projected/9c80e6d4-9dd4-48d9-8014-65cdf079d153-kube-api-access-lxncz\") pod \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.098934 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-dns-svc\") pod \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.099036 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-sb\") pod \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\" (UID: \"9c80e6d4-9dd4-48d9-8014-65cdf079d153\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.099751 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-swiftconf\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.099809 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d3794805-848d-403f-8cac-a2a6b76782c6-etc-swift\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.099939 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-scripts\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.100018 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxzzl\" (UniqueName: \"kubernetes.io/projected/d3794805-848d-403f-8cac-a2a6b76782c6-kube-api-access-mxzzl\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " 
pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.100150 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-combined-ca-bundle\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.100249 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-ring-data-devices\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.100483 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-dispersionconf\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.107757 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rjt5b"] Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.117583 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c80e6d4-9dd4-48d9-8014-65cdf079d153-kube-api-access-lxncz" (OuterVolumeSpecName: "kube-api-access-lxncz") pod "9c80e6d4-9dd4-48d9-8014-65cdf079d153" (UID: "9c80e6d4-9dd4-48d9-8014-65cdf079d153"). InnerVolumeSpecName "kube-api-access-lxncz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.160584 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c80e6d4-9dd4-48d9-8014-65cdf079d153" (UID: "9c80e6d4-9dd4-48d9-8014-65cdf079d153"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.168195 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-config" (OuterVolumeSpecName: "config") pod "9c80e6d4-9dd4-48d9-8014-65cdf079d153" (UID: "9c80e6d4-9dd4-48d9-8014-65cdf079d153"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.169968 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9c80e6d4-9dd4-48d9-8014-65cdf079d153" (UID: "9c80e6d4-9dd4-48d9-8014-65cdf079d153"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.171827 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c80e6d4-9dd4-48d9-8014-65cdf079d153" (UID: "9c80e6d4-9dd4-48d9-8014-65cdf079d153"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.201674 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-ring-data-devices\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.201735 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-dispersionconf\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.201780 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-swiftconf\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.201805 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d3794805-848d-403f-8cac-a2a6b76782c6-etc-swift\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.201833 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-swiftconf\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.201876 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-scripts\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.201900 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-combined-ca-bundle\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.201919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxzzl\" (UniqueName: \"kubernetes.io/projected/d3794805-848d-403f-8cac-a2a6b76782c6-kube-api-access-mxzzl\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.201939 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-dispersionconf\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " 
pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.201959 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nzf5\" (UniqueName: \"kubernetes.io/projected/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-kube-api-access-2nzf5\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.201994 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.202013 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-combined-ca-bundle\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.202033 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-scripts\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.202059 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-etc-swift\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.202086 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-ring-data-devices\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.202134 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.202144 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.202155 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxncz\" (UniqueName: \"kubernetes.io/projected/9c80e6d4-9dd4-48d9-8014-65cdf079d153-kube-api-access-lxncz\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.202167 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.202179 4763 reconciler_common.go:293] "Volume detached for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c80e6d4-9dd4-48d9-8014-65cdf079d153-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.202691 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-ring-data-devices\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.202902 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d3794805-848d-403f-8cac-a2a6b76782c6-etc-swift\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.203394 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-scripts\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: E0930 13:53:23.203779 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 13:53:23 crc kubenswrapper[4763]: E0930 13:53:23.203795 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 13:53:23 crc kubenswrapper[4763]: E0930 13:53:23.203855 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift podName:ba72f8d4-1822-4bb5-a099-c15d4b00b701 nodeName:}" failed. No retries permitted until 2025-09-30 13:53:24.20384175 +0000 UTC m=+1076.342402035 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift") pod "swift-storage-0" (UID: "ba72f8d4-1822-4bb5-a099-c15d4b00b701") : configmap "swift-ring-files" not found Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.206505 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-dispersionconf\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.206726 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-swiftconf\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.207532 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-combined-ca-bundle\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.220434 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxzzl\" (UniqueName: \"kubernetes.io/projected/d3794805-848d-403f-8cac-a2a6b76782c6-kube-api-access-mxzzl\") pod \"swift-ring-rebalance-s2d6m\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.304028 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-combined-ca-bundle\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.304102 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-dispersionconf\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.304137 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nzf5\" (UniqueName: \"kubernetes.io/projected/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-kube-api-access-2nzf5\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.304214 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-scripts\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.304266 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-etc-swift\") pod 
\"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.304337 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-ring-data-devices\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.304409 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-swiftconf\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.305641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-etc-swift\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.305889 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-scripts\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.306349 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-ring-data-devices\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.308520 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-dispersionconf\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.308678 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-swiftconf\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.313724 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-combined-ca-bundle\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.328424 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nzf5\" (UniqueName: \"kubernetes.io/projected/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-kube-api-access-2nzf5\") pod \"swift-ring-rebalance-rjt5b\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:23 crc kubenswrapper[4763]: 
Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.420868 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjt5b"
Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.579349 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw" event={"ID":"9a57f6d1-c09b-410a-af3c-8b3a010da11a","Type":"ContainerStarted","Data":"c361eaf5f095ffaa66bf4d8a6a114f837e677ebb597ac00b0ddebf3497844457"}
Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.580619 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.581320 4763 generic.go:334] "Generic (PLEG): container finished" podID="9c80e6d4-9dd4-48d9-8014-65cdf079d153" containerID="bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c" exitCode=0
Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.581385 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd"
Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.581394 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" event={"ID":"9c80e6d4-9dd4-48d9-8014-65cdf079d153","Type":"ContainerDied","Data":"bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c"}
Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.581429 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-jkmbd" event={"ID":"9c80e6d4-9dd4-48d9-8014-65cdf079d153","Type":"ContainerDied","Data":"370c843597c9449d591e96cea65b36176a4ec72122ea781bd4035a550689b5b2"}
Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.581452 4763 scope.go:117] "RemoveContainer" containerID="bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c"
Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.581543 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s2d6m"
Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.596792 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.615664 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw" podStartSLOduration=2.61564315 podStartE2EDuration="2.61564315s" podCreationTimestamp="2025-09-30 13:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:53:23.611056745 +0000 UTC m=+1075.749617030" watchObservedRunningTime="2025-09-30 13:53:23.61564315 +0000 UTC m=+1075.754203435" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.617873 4763 scope.go:117] "RemoveContainer" containerID="e44593d95d86cfe92fb66b4d93cb6b75d7030c8d48a8a19895b6ac4a42f37a63" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.657693 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-jkmbd"] Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.666721 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-jkmbd"] Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.698677 4763 scope.go:117] "RemoveContainer" containerID="bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c" Sep 30 13:53:23 crc kubenswrapper[4763]: E0930 13:53:23.699391 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c\": container with ID starting with bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c not found: ID does not exist" containerID="bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.699425 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c"} err="failed to get container status \"bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c\": rpc error: code = NotFound desc = could not find container \"bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c\": container with ID starting with bdc26bf696d8fc9a2b69d2fc74e982c077108aa1af26a0dc8b1f96d341df461c not found: ID does not exist" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.699452 4763 scope.go:117] "RemoveContainer" containerID="e44593d95d86cfe92fb66b4d93cb6b75d7030c8d48a8a19895b6ac4a42f37a63" Sep 30 13:53:23 crc kubenswrapper[4763]: E0930 13:53:23.699877 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e44593d95d86cfe92fb66b4d93cb6b75d7030c8d48a8a19895b6ac4a42f37a63\": container with ID starting with e44593d95d86cfe92fb66b4d93cb6b75d7030c8d48a8a19895b6ac4a42f37a63 not found: ID does not exist" containerID="e44593d95d86cfe92fb66b4d93cb6b75d7030c8d48a8a19895b6ac4a42f37a63" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.699898 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e44593d95d86cfe92fb66b4d93cb6b75d7030c8d48a8a19895b6ac4a42f37a63"} err="failed to get container status \"e44593d95d86cfe92fb66b4d93cb6b75d7030c8d48a8a19895b6ac4a42f37a63\": rpc error: code = NotFound desc = could not find container \"e44593d95d86cfe92fb66b4d93cb6b75d7030c8d48a8a19895b6ac4a42f37a63\": container with ID starting with 
e44593d95d86cfe92fb66b4d93cb6b75d7030c8d48a8a19895b6ac4a42f37a63 not found: ID does not exist" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.712743 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-combined-ca-bundle\") pod \"d3794805-848d-403f-8cac-a2a6b76782c6\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.712870 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxzzl\" (UniqueName: \"kubernetes.io/projected/d3794805-848d-403f-8cac-a2a6b76782c6-kube-api-access-mxzzl\") pod \"d3794805-848d-403f-8cac-a2a6b76782c6\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.712898 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-ring-data-devices\") pod \"d3794805-848d-403f-8cac-a2a6b76782c6\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.713031 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-swiftconf\") pod \"d3794805-848d-403f-8cac-a2a6b76782c6\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.713057 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-dispersionconf\") pod \"d3794805-848d-403f-8cac-a2a6b76782c6\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.713115 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-scripts\") pod \"d3794805-848d-403f-8cac-a2a6b76782c6\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.713149 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d3794805-848d-403f-8cac-a2a6b76782c6-etc-swift\") pod \"d3794805-848d-403f-8cac-a2a6b76782c6\" (UID: \"d3794805-848d-403f-8cac-a2a6b76782c6\") " Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.715635 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d3794805-848d-403f-8cac-a2a6b76782c6" (UID: "d3794805-848d-403f-8cac-a2a6b76782c6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.715905 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-scripts" (OuterVolumeSpecName: "scripts") pod "d3794805-848d-403f-8cac-a2a6b76782c6" (UID: "d3794805-848d-403f-8cac-a2a6b76782c6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.716783 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3794805-848d-403f-8cac-a2a6b76782c6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d3794805-848d-403f-8cac-a2a6b76782c6" (UID: "d3794805-848d-403f-8cac-a2a6b76782c6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.718434 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3794805-848d-403f-8cac-a2a6b76782c6-kube-api-access-mxzzl" (OuterVolumeSpecName: "kube-api-access-mxzzl") pod "d3794805-848d-403f-8cac-a2a6b76782c6" (UID: "d3794805-848d-403f-8cac-a2a6b76782c6"). InnerVolumeSpecName "kube-api-access-mxzzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.718513 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d3794805-848d-403f-8cac-a2a6b76782c6" (UID: "d3794805-848d-403f-8cac-a2a6b76782c6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.718845 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d3794805-848d-403f-8cac-a2a6b76782c6" (UID: "d3794805-848d-403f-8cac-a2a6b76782c6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.719126 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3794805-848d-403f-8cac-a2a6b76782c6" (UID: "d3794805-848d-403f-8cac-a2a6b76782c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.815533 4763 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.815572 4763 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.815586 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.815624 4763 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d3794805-848d-403f-8cac-a2a6b76782c6-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.815637 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3794805-848d-403f-8cac-a2a6b76782c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.815650 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxzzl\" (UniqueName: \"kubernetes.io/projected/d3794805-848d-403f-8cac-a2a6b76782c6-kube-api-access-mxzzl\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:23 crc kubenswrapper[4763]: I0930 13:53:23.815664 4763 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d3794805-848d-403f-8cac-a2a6b76782c6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:24 crc kubenswrapper[4763]: W0930 13:53:24.013309 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dfc595b_d82a_432b_aae3_fbde4c86b6d9.slice/crio-7275b433aa375ebfde3f8a8729a9be1405851843beb85604cdec2f7dec9c3c3b WatchSource:0}: Error finding container 7275b433aa375ebfde3f8a8729a9be1405851843beb85604cdec2f7dec9c3c3b: Status 404 returned error can't find the container with id 7275b433aa375ebfde3f8a8729a9be1405851843beb85604cdec2f7dec9c3c3b Sep 30 13:53:24 crc kubenswrapper[4763]: I0930 13:53:24.015852 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rjt5b"] Sep 30 13:53:24 crc kubenswrapper[4763]: I0930 13:53:24.220819 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0" Sep 30 13:53:24 crc kubenswrapper[4763]: E0930 13:53:24.221009 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 13:53:24 crc kubenswrapper[4763]: E0930 13:53:24.221026 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 13:53:24 crc kubenswrapper[4763]: E0930 13:53:24.221082 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift podName:ba72f8d4-1822-4bb5-a099-c15d4b00b701 nodeName:}" failed. No retries permitted until 2025-09-30 13:53:26.221065324 +0000 UTC m=+1078.359625609 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift") pod "swift-storage-0" (UID: "ba72f8d4-1822-4bb5-a099-c15d4b00b701") : configmap "swift-ring-files" not found Sep 30 13:53:24 crc kubenswrapper[4763]: I0930 13:53:24.513243 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c80e6d4-9dd4-48d9-8014-65cdf079d153" path="/var/lib/kubelet/pods/9c80e6d4-9dd4-48d9-8014-65cdf079d153/volumes" Sep 30 13:53:24 crc kubenswrapper[4763]: I0930 13:53:24.607794 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjt5b" event={"ID":"2dfc595b-d82a-432b-aae3-fbde4c86b6d9","Type":"ContainerStarted","Data":"7275b433aa375ebfde3f8a8729a9be1405851843beb85604cdec2f7dec9c3c3b"} Sep 30 13:53:24 crc kubenswrapper[4763]: I0930 13:53:24.607867 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s2d6m" Sep 30 13:53:24 crc kubenswrapper[4763]: I0930 13:53:24.655541 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-s2d6m"] Sep 30 13:53:24 crc kubenswrapper[4763]: I0930 13:53:24.660848 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-s2d6m"] Sep 30 13:53:25 crc kubenswrapper[4763]: I0930 13:53:25.255040 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 13:53:25 crc kubenswrapper[4763]: I0930 13:53:25.330281 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 13:53:26 crc kubenswrapper[4763]: I0930 13:53:26.265645 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0" Sep 30 13:53:26 crc kubenswrapper[4763]: E0930 13:53:26.265824 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 13:53:26 crc kubenswrapper[4763]: E0930 13:53:26.265841 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 13:53:26 crc kubenswrapper[4763]: E0930 13:53:26.265895 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift podName:ba72f8d4-1822-4bb5-a099-c15d4b00b701 nodeName:}" failed. No retries permitted until 2025-09-30 13:53:30.265879362 +0000 UTC m=+1082.404439647 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift") pod "swift-storage-0" (UID: "ba72f8d4-1822-4bb5-a099-c15d4b00b701") : configmap "swift-ring-files" not found Sep 30 13:53:26 crc kubenswrapper[4763]: I0930 13:53:26.501214 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3794805-848d-403f-8cac-a2a6b76782c6" path="/var/lib/kubelet/pods/d3794805-848d-403f-8cac-a2a6b76782c6/volumes" Sep 30 13:53:27 crc kubenswrapper[4763]: I0930 13:53:27.239302 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 13:53:27 crc kubenswrapper[4763]: I0930 13:53:27.296863 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 13:53:27 crc kubenswrapper[4763]: I0930 13:53:27.634369 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjt5b" event={"ID":"2dfc595b-d82a-432b-aae3-fbde4c86b6d9","Type":"ContainerStarted","Data":"a6ea38cc2fc406d3ccc0ad4f22050e6ab3828a4bcfdcdf48337ef33e08b35c48"} Sep 30 13:53:27 crc kubenswrapper[4763]: I0930 13:53:27.657661 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-rjt5b" podStartSLOduration=1.299639386 podStartE2EDuration="4.65763913s" podCreationTimestamp="2025-09-30 13:53:23 +0000 UTC" firstStartedPulling="2025-09-30 13:53:24.017003949 +0000 UTC m=+1076.155564234" lastFinishedPulling="2025-09-30 13:53:27.375003693 +0000 UTC m=+1079.513563978" observedRunningTime="2025-09-30 13:53:27.648329556 +0000 UTC m=+1079.786889841" watchObservedRunningTime="2025-09-30 13:53:27.65763913 +0000 UTC m=+1079.796199405" Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.308330 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jpst6"] Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.309842 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jpst6" Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.316152 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jpst6"] Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.415721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh6st\" (UniqueName: \"kubernetes.io/projected/f6826233-27ed-40bc-9d7b-92312272f1e5-kube-api-access-nh6st\") pod \"keystone-db-create-jpst6\" (UID: \"f6826233-27ed-40bc-9d7b-92312272f1e5\") " pod="openstack/keystone-db-create-jpst6" Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.508037 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-d5jt9"] Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.509204 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-d5jt9" Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.516947 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh6st\" (UniqueName: \"kubernetes.io/projected/f6826233-27ed-40bc-9d7b-92312272f1e5-kube-api-access-nh6st\") pod \"keystone-db-create-jpst6\" (UID: \"f6826233-27ed-40bc-9d7b-92312272f1e5\") " pod="openstack/keystone-db-create-jpst6" Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.518301 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-d5jt9"] Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.541149 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh6st\" (UniqueName: \"kubernetes.io/projected/f6826233-27ed-40bc-9d7b-92312272f1e5-kube-api-access-nh6st\") pod \"keystone-db-create-jpst6\" (UID: \"f6826233-27ed-40bc-9d7b-92312272f1e5\") " pod="openstack/keystone-db-create-jpst6" Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.618389 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7h8q\" (UniqueName: \"kubernetes.io/projected/5af7ed04-4a1d-4283-b6c3-e9afb4ee4675-kube-api-access-w7h8q\") pod \"placement-db-create-d5jt9\" (UID: \"5af7ed04-4a1d-4283-b6c3-e9afb4ee4675\") " pod="openstack/placement-db-create-d5jt9" Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.634038 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jpst6" Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.720172 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7h8q\" (UniqueName: \"kubernetes.io/projected/5af7ed04-4a1d-4283-b6c3-e9afb4ee4675-kube-api-access-w7h8q\") pod \"placement-db-create-d5jt9\" (UID: \"5af7ed04-4a1d-4283-b6c3-e9afb4ee4675\") " pod="openstack/placement-db-create-d5jt9" Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.743210 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7h8q\" (UniqueName: \"kubernetes.io/projected/5af7ed04-4a1d-4283-b6c3-e9afb4ee4675-kube-api-access-w7h8q\") pod \"placement-db-create-d5jt9\" (UID: \"5af7ed04-4a1d-4283-b6c3-e9afb4ee4675\") " pod="openstack/placement-db-create-d5jt9" Sep 30 13:53:29 crc kubenswrapper[4763]: I0930 13:53:29.822893 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-d5jt9" Sep 30 13:53:30 crc kubenswrapper[4763]: I0930 13:53:30.053461 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jpst6"] Sep 30 13:53:30 crc kubenswrapper[4763]: I0930 13:53:30.279885 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-d5jt9"] Sep 30 13:53:30 crc kubenswrapper[4763]: W0930 13:53:30.315862 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5af7ed04_4a1d_4283_b6c3_e9afb4ee4675.slice/crio-fa06acfe37a9a307aa7ec5fd1b99c8330a1107ba507e106340e9f7429a59a05b WatchSource:0}: Error finding container fa06acfe37a9a307aa7ec5fd1b99c8330a1107ba507e106340e9f7429a59a05b: Status 404 returned error can't find the container with id fa06acfe37a9a307aa7ec5fd1b99c8330a1107ba507e106340e9f7429a59a05b Sep 30 13:53:30 crc kubenswrapper[4763]: I0930 13:53:30.338690 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0" Sep 30 13:53:30 crc kubenswrapper[4763]: E0930 13:53:30.338937 4763 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 13:53:30 crc kubenswrapper[4763]: E0930 13:53:30.338976 4763 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 13:53:30 crc kubenswrapper[4763]: E0930 13:53:30.339050 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift podName:ba72f8d4-1822-4bb5-a099-c15d4b00b701 nodeName:}" failed. No retries permitted until 2025-09-30 13:53:38.339025874 +0000 UTC m=+1090.477586159 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift") pod "swift-storage-0" (UID: "ba72f8d4-1822-4bb5-a099-c15d4b00b701") : configmap "swift-ring-files" not found
Sep 30 13:53:30 crc kubenswrapper[4763]: I0930 13:53:30.663824 4763 generic.go:334] "Generic (PLEG): container finished" podID="f6826233-27ed-40bc-9d7b-92312272f1e5" containerID="ed2ff9ff9884e55f31010027f8a058be4e0e73d12107dca3cb5d5fa374a21077" exitCode=0
Sep 30 13:53:30 crc kubenswrapper[4763]: I0930 13:53:30.663927 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jpst6" event={"ID":"f6826233-27ed-40bc-9d7b-92312272f1e5","Type":"ContainerDied","Data":"ed2ff9ff9884e55f31010027f8a058be4e0e73d12107dca3cb5d5fa374a21077"}
Sep 30 13:53:30 crc kubenswrapper[4763]: I0930 13:53:30.664289 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jpst6" event={"ID":"f6826233-27ed-40bc-9d7b-92312272f1e5","Type":"ContainerStarted","Data":"2f881d77f57b0e8174898caa9d20815f19181d6a58cc92414dc4a6b8f2ac322a"}
Sep 30 13:53:30 crc kubenswrapper[4763]: I0930 13:53:30.666936 4763 generic.go:334] "Generic (PLEG): container finished" podID="5af7ed04-4a1d-4283-b6c3-e9afb4ee4675" containerID="1e04a4e066c062d270cdfe94173bdc4d2b4a9790730eae432b09da3c6571223b" exitCode=0
Sep 30 13:53:30 crc kubenswrapper[4763]: I0930 13:53:30.666981 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d5jt9" event={"ID":"5af7ed04-4a1d-4283-b6c3-e9afb4ee4675","Type":"ContainerDied","Data":"1e04a4e066c062d270cdfe94173bdc4d2b4a9790730eae432b09da3c6571223b"}
Sep 30 13:53:30 crc kubenswrapper[4763]: I0930 13:53:30.667013 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d5jt9" event={"ID":"5af7ed04-4a1d-4283-b6c3-e9afb4ee4675","Type":"ContainerStarted","Data":"fa06acfe37a9a307aa7ec5fd1b99c8330a1107ba507e106340e9f7429a59a05b"}
Sep 30 13:53:31 crc kubenswrapper[4763]: I0930 13:53:31.666905 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"
Sep 30 13:53:31 crc kubenswrapper[4763]: I0930 13:53:31.749044 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-mlfnr"]
Sep 30 13:53:31 crc kubenswrapper[4763]: I0930 13:53:31.749313 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" podUID="899b170a-9f2f-4275-afcc-a78446c89728" containerName="dnsmasq-dns" containerID="cri-o://7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f" gracePeriod=10
Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.084635 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-d5jt9"
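The swift-storage-0 mount failures above follow the kubelet's per-volume exponential backoff: the projected etc-swift volume sources the swift-ring-files ConfigMap, which the swift-ring-rebalance job has not yet published, so each failed SetUp doubles durationBeforeRetry, visible as 2s, then 4s, then 8s. A minimal sketch of that doubling, assuming an illustrative cap (the real cap is not shown in this log):

```go
// Sketch of the retry-delay doubling seen above as durationBeforeRetry 2s -> 4s -> 8s;
// not kubelet source, and the 2m cap is an assumption for illustration only.
package main

import (
	"fmt"
	"time"
)

func main() {
	mountErr := `configmap "swift-ring-files" not found`
	delay := 2 * time.Second
	const maxDelay = 2 * time.Minute
	for attempt := 1; attempt <= 3; attempt++ {
		fmt.Printf("attempt %d: MountVolume.SetUp failed: %s; no retries permitted for %s\n",
			attempt, mountErr, delay)
		if delay < maxDelay {
			delay *= 2 // double until capped, or until the ConfigMap finally exists
		}
	}
}
```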
Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.176395 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jpst6"
Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.177214 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7h8q\" (UniqueName: \"kubernetes.io/projected/5af7ed04-4a1d-4283-b6c3-e9afb4ee4675-kube-api-access-w7h8q\") pod \"5af7ed04-4a1d-4283-b6c3-e9afb4ee4675\" (UID: \"5af7ed04-4a1d-4283-b6c3-e9afb4ee4675\") "
Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.185078 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5af7ed04-4a1d-4283-b6c3-e9afb4ee4675-kube-api-access-w7h8q" (OuterVolumeSpecName: "kube-api-access-w7h8q") pod "5af7ed04-4a1d-4283-b6c3-e9afb4ee4675" (UID: "5af7ed04-4a1d-4283-b6c3-e9afb4ee4675"). InnerVolumeSpecName "kube-api-access-w7h8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.279332 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh6st\" (UniqueName: \"kubernetes.io/projected/f6826233-27ed-40bc-9d7b-92312272f1e5-kube-api-access-nh6st\") pod \"f6826233-27ed-40bc-9d7b-92312272f1e5\" (UID: \"f6826233-27ed-40bc-9d7b-92312272f1e5\") "
Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.279945 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7h8q\" (UniqueName: \"kubernetes.io/projected/5af7ed04-4a1d-4283-b6c3-e9afb4ee4675-kube-api-access-w7h8q\") on node \"crc\" DevicePath \"\""
Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.281952 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6826233-27ed-40bc-9d7b-92312272f1e5-kube-api-access-nh6st" (OuterVolumeSpecName: "kube-api-access-nh6st") pod "f6826233-27ed-40bc-9d7b-92312272f1e5" (UID: "f6826233-27ed-40bc-9d7b-92312272f1e5"). InnerVolumeSpecName "kube-api-access-nh6st". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.381127 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh6st\" (UniqueName: \"kubernetes.io/projected/f6826233-27ed-40bc-9d7b-92312272f1e5-kube-api-access-nh6st\") on node \"crc\" DevicePath \"\""
Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.630105 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr"
Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.685193 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jpst6" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.685168 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jpst6" event={"ID":"f6826233-27ed-40bc-9d7b-92312272f1e5","Type":"ContainerDied","Data":"2f881d77f57b0e8174898caa9d20815f19181d6a58cc92414dc4a6b8f2ac322a"} Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.685288 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f881d77f57b0e8174898caa9d20815f19181d6a58cc92414dc4a6b8f2ac322a" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.686921 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-config\") pod \"899b170a-9f2f-4275-afcc-a78446c89728\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.686978 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnxt2\" (UniqueName: \"kubernetes.io/projected/899b170a-9f2f-4275-afcc-a78446c89728-kube-api-access-pnxt2\") pod \"899b170a-9f2f-4275-afcc-a78446c89728\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.686999 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-dns-svc\") pod \"899b170a-9f2f-4275-afcc-a78446c89728\" (UID: \"899b170a-9f2f-4275-afcc-a78446c89728\") " Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.687432 4763 generic.go:334] "Generic (PLEG): container finished" podID="3119638a-6580-4a24-8e7f-40f7f7d788a5" containerID="f4cd7078c0ddc04e2c8e6651fa5ad9a35e37bd449097a1bdf9256ab5a071cf30" exitCode=0 Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.687474 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3119638a-6580-4a24-8e7f-40f7f7d788a5","Type":"ContainerDied","Data":"f4cd7078c0ddc04e2c8e6651fa5ad9a35e37bd449097a1bdf9256ab5a071cf30"} Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.693765 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/899b170a-9f2f-4275-afcc-a78446c89728-kube-api-access-pnxt2" (OuterVolumeSpecName: "kube-api-access-pnxt2") pod "899b170a-9f2f-4275-afcc-a78446c89728" (UID: "899b170a-9f2f-4275-afcc-a78446c89728"). InnerVolumeSpecName "kube-api-access-pnxt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.702214 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-d5jt9" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.702231 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d5jt9" event={"ID":"5af7ed04-4a1d-4283-b6c3-e9afb4ee4675","Type":"ContainerDied","Data":"fa06acfe37a9a307aa7ec5fd1b99c8330a1107ba507e106340e9f7429a59a05b"} Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.702270 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa06acfe37a9a307aa7ec5fd1b99c8330a1107ba507e106340e9f7429a59a05b" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.707797 4763 generic.go:334] "Generic (PLEG): container finished" podID="899b170a-9f2f-4275-afcc-a78446c89728" containerID="7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f" exitCode=0 Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.707860 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" event={"ID":"899b170a-9f2f-4275-afcc-a78446c89728","Type":"ContainerDied","Data":"7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f"} Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.707867 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.707886 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-mlfnr" event={"ID":"899b170a-9f2f-4275-afcc-a78446c89728","Type":"ContainerDied","Data":"d100aa5a9c39898d37d500e316b65b8fe2af3bbfc2c6ff0c46be290964292176"} Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.707903 4763 scope.go:117] "RemoveContainer" containerID="7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.714990 4763 generic.go:334] "Generic (PLEG): container finished" podID="aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" containerID="ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707" exitCode=0 Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.715027 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c","Type":"ContainerDied","Data":"ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707"} Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.732698 4763 scope.go:117] "RemoveContainer" containerID="3a20e4d4dfb70d194d2b1ce810de4ebd7a7633126a43b1296fb41c26ee901fae" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.750488 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-config" (OuterVolumeSpecName: "config") pod "899b170a-9f2f-4275-afcc-a78446c89728" (UID: "899b170a-9f2f-4275-afcc-a78446c89728"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.750620 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "899b170a-9f2f-4275-afcc-a78446c89728" (UID: "899b170a-9f2f-4275-afcc-a78446c89728"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.752955 4763 scope.go:117] "RemoveContainer" containerID="7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f" Sep 30 13:53:32 crc kubenswrapper[4763]: E0930 13:53:32.753278 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f\": container with ID starting with 7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f not found: ID does not exist" containerID="7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.753310 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f"} err="failed to get container status \"7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f\": rpc error: code = NotFound desc = could not find container \"7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f\": container with ID starting with 7077a53bf5318b2779af56c88a6f1588a0ee3f9f3cd5cd6cde22994a7946ff7f not found: ID does not exist" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.753328 4763 scope.go:117] "RemoveContainer" containerID="3a20e4d4dfb70d194d2b1ce810de4ebd7a7633126a43b1296fb41c26ee901fae" Sep 30 13:53:32 crc kubenswrapper[4763]: E0930 13:53:32.753669 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a20e4d4dfb70d194d2b1ce810de4ebd7a7633126a43b1296fb41c26ee901fae\": container with ID starting with 3a20e4d4dfb70d194d2b1ce810de4ebd7a7633126a43b1296fb41c26ee901fae not found: ID does not exist" containerID="3a20e4d4dfb70d194d2b1ce810de4ebd7a7633126a43b1296fb41c26ee901fae" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.753714 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a20e4d4dfb70d194d2b1ce810de4ebd7a7633126a43b1296fb41c26ee901fae"} err="failed to get container status \"3a20e4d4dfb70d194d2b1ce810de4ebd7a7633126a43b1296fb41c26ee901fae\": rpc error: code = NotFound desc = could not find container \"3a20e4d4dfb70d194d2b1ce810de4ebd7a7633126a43b1296fb41c26ee901fae\": container with ID starting with 3a20e4d4dfb70d194d2b1ce810de4ebd7a7633126a43b1296fb41c26ee901fae not found: ID does not exist" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.790114 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.790142 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/899b170a-9f2f-4275-afcc-a78446c89728-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:32 crc kubenswrapper[4763]: I0930 13:53:32.790152 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnxt2\" (UniqueName: \"kubernetes.io/projected/899b170a-9f2f-4275-afcc-a78446c89728-kube-api-access-pnxt2\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:33 crc kubenswrapper[4763]: I0930 13:53:33.037866 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-mlfnr"] Sep 30 13:53:33 crc kubenswrapper[4763]: I0930 13:53:33.048037 4763 
Sep 30 13:53:33 crc kubenswrapper[4763]: I0930 13:53:33.048037 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-mlfnr"]
Sep 30 13:53:33 crc kubenswrapper[4763]: I0930 13:53:33.725851 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c","Type":"ContainerStarted","Data":"28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9"}
Sep 30 13:53:33 crc kubenswrapper[4763]: I0930 13:53:33.726071 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 13:53:33 crc kubenswrapper[4763]: I0930 13:53:33.728666 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3119638a-6580-4a24-8e7f-40f7f7d788a5","Type":"ContainerStarted","Data":"c5af37dfd26586dfbe5d5f60114f298ea522d4e3bbbc87c8e965efa23a5cf953"}
Sep 30 13:53:33 crc kubenswrapper[4763]: I0930 13:53:33.729172 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Sep 30 13:53:33 crc kubenswrapper[4763]: I0930 13:53:33.749380 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=34.873748109 podStartE2EDuration="48.749362302s" podCreationTimestamp="2025-09-30 13:52:45 +0000 UTC" firstStartedPulling="2025-09-30 13:52:47.052260216 +0000 UTC m=+1039.190820501" lastFinishedPulling="2025-09-30 13:53:00.927874409 +0000 UTC m=+1053.066434694" observedRunningTime="2025-09-30 13:53:33.749198678 +0000 UTC m=+1085.887758973" watchObservedRunningTime="2025-09-30 13:53:33.749362302 +0000 UTC m=+1085.887922597"
Sep 30 13:53:33 crc kubenswrapper[4763]: I0930 13:53:33.777563 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.529970816 podStartE2EDuration="49.7775468s" podCreationTimestamp="2025-09-30 13:52:44 +0000 UTC" firstStartedPulling="2025-09-30 13:52:46.761011652 +0000 UTC m=+1038.899571937" lastFinishedPulling="2025-09-30 13:53:01.008587636 +0000 UTC m=+1053.147147921" observedRunningTime="2025-09-30 13:53:33.776143475 +0000 UTC m=+1085.914703750" watchObservedRunningTime="2025-09-30 13:53:33.7775468 +0000 UTC m=+1085.916107095"
Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.498008 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899b170a-9f2f-4275-afcc-a78446c89728" path="/var/lib/kubelet/pods/899b170a-9f2f-4275-afcc-a78446c89728/volumes"
Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.735871 4763 generic.go:334] "Generic (PLEG): container finished" podID="2dfc595b-d82a-432b-aae3-fbde4c86b6d9" containerID="a6ea38cc2fc406d3ccc0ad4f22050e6ab3828a4bcfdcdf48337ef33e08b35c48" exitCode=0
Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.735982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjt5b" event={"ID":"2dfc595b-d82a-432b-aae3-fbde4c86b6d9","Type":"ContainerDied","Data":"a6ea38cc2fc406d3ccc0ad4f22050e6ab3828a4bcfdcdf48337ef33e08b35c48"}
Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.831257 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lz7ln"]
Sep 30 13:53:34 crc kubenswrapper[4763]: E0930 13:53:34.831858 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899b170a-9f2f-4275-afcc-a78446c89728" containerName="init"
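The two pod_startup_latency_tracker entries above also show what podStartSLOduration excludes: for rabbitmq-server-0, podStartE2EDuration (49.7775468s) minus podStartSLOduration (35.529970816s) is 14.247575984s, exactly lastFinishedPulling minus firstStartedPulling, i.e. the image-pull window. A quick check with the values copied from the log:

```go
// Check of the "Observed pod startup duration" entries above: E2E minus SLO
// should equal the image-pull window, which the SLO figure excludes.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	// rabbitmq-server-0 values copied from the log entries above.
	first, _ := time.Parse(layout, "2025-09-30 13:52:46.761011652 +0000 UTC")
	last, _ := time.Parse(layout, "2025-09-30 13:53:01.008587636 +0000 UTC")
	fmt.Println(last.Sub(first))                   // 14.247575984s spent pulling images
	fmt.Printf("%.9fs\n", 49.7775468-35.529970816) // podStartE2EDuration - podStartSLOduration
}
```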
assignment" podUID="899b170a-9f2f-4275-afcc-a78446c89728" containerName="init" Sep 30 13:53:34 crc kubenswrapper[4763]: E0930 13:53:34.831891 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6826233-27ed-40bc-9d7b-92312272f1e5" containerName="mariadb-database-create" Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.831897 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6826233-27ed-40bc-9d7b-92312272f1e5" containerName="mariadb-database-create" Sep 30 13:53:34 crc kubenswrapper[4763]: E0930 13:53:34.831907 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5af7ed04-4a1d-4283-b6c3-e9afb4ee4675" containerName="mariadb-database-create" Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.831914 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af7ed04-4a1d-4283-b6c3-e9afb4ee4675" containerName="mariadb-database-create" Sep 30 13:53:34 crc kubenswrapper[4763]: E0930 13:53:34.831942 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899b170a-9f2f-4275-afcc-a78446c89728" containerName="dnsmasq-dns" Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.831950 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="899b170a-9f2f-4275-afcc-a78446c89728" containerName="dnsmasq-dns" Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.832132 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5af7ed04-4a1d-4283-b6c3-e9afb4ee4675" containerName="mariadb-database-create" Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.832160 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6826233-27ed-40bc-9d7b-92312272f1e5" containerName="mariadb-database-create" Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.832173 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="899b170a-9f2f-4275-afcc-a78446c89728" containerName="dnsmasq-dns" Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.832787 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lz7ln" Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.844996 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lz7ln"] Sep 30 13:53:34 crc kubenswrapper[4763]: I0930 13:53:34.929900 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkzkz\" (UniqueName: \"kubernetes.io/projected/f17c5ea5-8c54-4352-be0f-60a61fb6b7ba-kube-api-access-jkzkz\") pod \"glance-db-create-lz7ln\" (UID: \"f17c5ea5-8c54-4352-be0f-60a61fb6b7ba\") " pod="openstack/glance-db-create-lz7ln" Sep 30 13:53:35 crc kubenswrapper[4763]: I0930 13:53:35.032518 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkzkz\" (UniqueName: \"kubernetes.io/projected/f17c5ea5-8c54-4352-be0f-60a61fb6b7ba-kube-api-access-jkzkz\") pod \"glance-db-create-lz7ln\" (UID: \"f17c5ea5-8c54-4352-be0f-60a61fb6b7ba\") " pod="openstack/glance-db-create-lz7ln" Sep 30 13:53:35 crc kubenswrapper[4763]: I0930 13:53:35.073077 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkzkz\" (UniqueName: \"kubernetes.io/projected/f17c5ea5-8c54-4352-be0f-60a61fb6b7ba-kube-api-access-jkzkz\") pod \"glance-db-create-lz7ln\" (UID: \"f17c5ea5-8c54-4352-be0f-60a61fb6b7ba\") " pod="openstack/glance-db-create-lz7ln" Sep 30 13:53:35 crc kubenswrapper[4763]: I0930 13:53:35.148797 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lz7ln" Sep 30 13:53:35 crc kubenswrapper[4763]: I0930 13:53:35.325949 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 13:53:35 crc kubenswrapper[4763]: I0930 13:53:35.571306 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lz7ln"] Sep 30 13:53:35 crc kubenswrapper[4763]: I0930 13:53:35.745183 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lz7ln" event={"ID":"f17c5ea5-8c54-4352-be0f-60a61fb6b7ba","Type":"ContainerStarted","Data":"e2e055db9fefff8aff0fd24fed8b68b4440bb95925f9d67dbbb38fcff4a3852f"} Sep 30 13:53:35 crc kubenswrapper[4763]: I0930 13:53:35.745409 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lz7ln" event={"ID":"f17c5ea5-8c54-4352-be0f-60a61fb6b7ba","Type":"ContainerStarted","Data":"7f9a180721542c1f25423356c48089caafdce651797c76181fc2c4678947cafd"} Sep 30 13:53:35 crc kubenswrapper[4763]: I0930 13:53:35.765510 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-lz7ln" podStartSLOduration=1.7654863299999999 podStartE2EDuration="1.76548633s" podCreationTimestamp="2025-09-30 13:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:53:35.757791906 +0000 UTC m=+1087.896352191" watchObservedRunningTime="2025-09-30 13:53:35.76548633 +0000 UTC m=+1087.904046615" Sep 30 13:53:35 crc kubenswrapper[4763]: I0930 13:53:35.959554 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.049337 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-scripts\") pod \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.049406 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-dispersionconf\") pod \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.049452 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-combined-ca-bundle\") pod \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.049553 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-swiftconf\") pod \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.049575 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nzf5\" (UniqueName: \"kubernetes.io/projected/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-kube-api-access-2nzf5\") pod \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " Sep 30 13:53:36 crc 
kubenswrapper[4763]: I0930 13:53:36.049622 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-ring-data-devices\") pod \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.049641 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-etc-swift\") pod \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\" (UID: \"2dfc595b-d82a-432b-aae3-fbde4c86b6d9\") " Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.050171 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2dfc595b-d82a-432b-aae3-fbde4c86b6d9" (UID: "2dfc595b-d82a-432b-aae3-fbde4c86b6d9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.050705 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2dfc595b-d82a-432b-aae3-fbde4c86b6d9" (UID: "2dfc595b-d82a-432b-aae3-fbde4c86b6d9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.056328 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-kube-api-access-2nzf5" (OuterVolumeSpecName: "kube-api-access-2nzf5") pod "2dfc595b-d82a-432b-aae3-fbde4c86b6d9" (UID: "2dfc595b-d82a-432b-aae3-fbde4c86b6d9"). InnerVolumeSpecName "kube-api-access-2nzf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.056702 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2dfc595b-d82a-432b-aae3-fbde4c86b6d9" (UID: "2dfc595b-d82a-432b-aae3-fbde4c86b6d9"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.060116 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.060184 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.085318 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dfc595b-d82a-432b-aae3-fbde4c86b6d9" (UID: "2dfc595b-d82a-432b-aae3-fbde4c86b6d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.090542 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2dfc595b-d82a-432b-aae3-fbde4c86b6d9" (UID: "2dfc595b-d82a-432b-aae3-fbde4c86b6d9"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.105132 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-scripts" (OuterVolumeSpecName: "scripts") pod "2dfc595b-d82a-432b-aae3-fbde4c86b6d9" (UID: "2dfc595b-d82a-432b-aae3-fbde4c86b6d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.151507 4763 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.151542 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nzf5\" (UniqueName: \"kubernetes.io/projected/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-kube-api-access-2nzf5\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.151552 4763 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.151560 4763 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.151568 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.151578 4763 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.151586 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfc595b-d82a-432b-aae3-fbde4c86b6d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.755967 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjt5b" event={"ID":"2dfc595b-d82a-432b-aae3-fbde4c86b6d9","Type":"ContainerDied","Data":"7275b433aa375ebfde3f8a8729a9be1405851843beb85604cdec2f7dec9c3c3b"} Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.757123 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7275b433aa375ebfde3f8a8729a9be1405851843beb85604cdec2f7dec9c3c3b" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.755988 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjt5b" Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.757883 4763 generic.go:334] "Generic (PLEG): container finished" podID="f17c5ea5-8c54-4352-be0f-60a61fb6b7ba" containerID="e2e055db9fefff8aff0fd24fed8b68b4440bb95925f9d67dbbb38fcff4a3852f" exitCode=0 Sep 30 13:53:36 crc kubenswrapper[4763]: I0930 13:53:36.757923 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lz7ln" event={"ID":"f17c5ea5-8c54-4352-be0f-60a61fb6b7ba","Type":"ContainerDied","Data":"e2e055db9fefff8aff0fd24fed8b68b4440bb95925f9d67dbbb38fcff4a3852f"} Sep 30 13:53:38 crc kubenswrapper[4763]: I0930 13:53:38.035993 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lz7ln" Sep 30 13:53:38 crc kubenswrapper[4763]: I0930 13:53:38.182613 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkzkz\" (UniqueName: \"kubernetes.io/projected/f17c5ea5-8c54-4352-be0f-60a61fb6b7ba-kube-api-access-jkzkz\") pod \"f17c5ea5-8c54-4352-be0f-60a61fb6b7ba\" (UID: \"f17c5ea5-8c54-4352-be0f-60a61fb6b7ba\") " Sep 30 13:53:38 crc kubenswrapper[4763]: I0930 13:53:38.188485 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f17c5ea5-8c54-4352-be0f-60a61fb6b7ba-kube-api-access-jkzkz" (OuterVolumeSpecName: "kube-api-access-jkzkz") pod "f17c5ea5-8c54-4352-be0f-60a61fb6b7ba" (UID: "f17c5ea5-8c54-4352-be0f-60a61fb6b7ba"). InnerVolumeSpecName "kube-api-access-jkzkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:38 crc kubenswrapper[4763]: I0930 13:53:38.284352 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkzkz\" (UniqueName: \"kubernetes.io/projected/f17c5ea5-8c54-4352-be0f-60a61fb6b7ba-kube-api-access-jkzkz\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:38 crc kubenswrapper[4763]: I0930 13:53:38.385492 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0" Sep 30 13:53:38 crc kubenswrapper[4763]: I0930 13:53:38.390641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift\") pod \"swift-storage-0\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " pod="openstack/swift-storage-0" Sep 30 13:53:38 crc kubenswrapper[4763]: I0930 13:53:38.413061 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 13:53:38 crc kubenswrapper[4763]: I0930 13:53:38.775781 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lz7ln" event={"ID":"f17c5ea5-8c54-4352-be0f-60a61fb6b7ba","Type":"ContainerDied","Data":"7f9a180721542c1f25423356c48089caafdce651797c76181fc2c4678947cafd"} Sep 30 13:53:38 crc kubenswrapper[4763]: I0930 13:53:38.776056 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f9a180721542c1f25423356c48089caafdce651797c76181fc2c4678947cafd" Sep 30 13:53:38 crc kubenswrapper[4763]: I0930 13:53:38.776110 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lz7ln" Sep 30 13:53:38 crc kubenswrapper[4763]: I0930 13:53:38.912424 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 13:53:38 crc kubenswrapper[4763]: W0930 13:53:38.919486 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba72f8d4_1822_4bb5_a099_c15d4b00b701.slice/crio-8396bfe7c730495814b6c31d1b6eec95410f28177e3bd00d50d87a223d392f14 WatchSource:0}: Error finding container 8396bfe7c730495814b6c31d1b6eec95410f28177e3bd00d50d87a223d392f14: Status 404 returned error can't find the container with id 8396bfe7c730495814b6c31d1b6eec95410f28177e3bd00d50d87a223d392f14 Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.345363 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-fac2-account-create-757bn"] Sep 30 13:53:39 crc kubenswrapper[4763]: E0930 13:53:39.345679 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfc595b-d82a-432b-aae3-fbde4c86b6d9" containerName="swift-ring-rebalance" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.345691 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfc595b-d82a-432b-aae3-fbde4c86b6d9" containerName="swift-ring-rebalance" Sep 30 13:53:39 crc kubenswrapper[4763]: E0930 13:53:39.345711 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f17c5ea5-8c54-4352-be0f-60a61fb6b7ba" containerName="mariadb-database-create" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.345718 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17c5ea5-8c54-4352-be0f-60a61fb6b7ba" containerName="mariadb-database-create" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.345876 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f17c5ea5-8c54-4352-be0f-60a61fb6b7ba" containerName="mariadb-database-create" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.345895 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dfc595b-d82a-432b-aae3-fbde4c86b6d9" containerName="swift-ring-rebalance" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.346357 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fac2-account-create-757bn" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.348179 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.364618 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fac2-account-create-757bn"] Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.508902 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6qsc\" (UniqueName: \"kubernetes.io/projected/573ae1be-3060-4fea-a7fe-b7feeaa60cc7-kube-api-access-n6qsc\") pod \"keystone-fac2-account-create-757bn\" (UID: \"573ae1be-3060-4fea-a7fe-b7feeaa60cc7\") " pod="openstack/keystone-fac2-account-create-757bn" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.611229 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6qsc\" (UniqueName: \"kubernetes.io/projected/573ae1be-3060-4fea-a7fe-b7feeaa60cc7-kube-api-access-n6qsc\") pod \"keystone-fac2-account-create-757bn\" (UID: \"573ae1be-3060-4fea-a7fe-b7feeaa60cc7\") " pod="openstack/keystone-fac2-account-create-757bn" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.631320 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6qsc\" (UniqueName: \"kubernetes.io/projected/573ae1be-3060-4fea-a7fe-b7feeaa60cc7-kube-api-access-n6qsc\") pod \"keystone-fac2-account-create-757bn\" (UID: \"573ae1be-3060-4fea-a7fe-b7feeaa60cc7\") " pod="openstack/keystone-fac2-account-create-757bn" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.653926 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b89d-account-create-d82w4"] Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.655089 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b89d-account-create-d82w4" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.664159 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fac2-account-create-757bn" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.664254 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b89d-account-create-d82w4"] Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.664768 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.785271 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"8396bfe7c730495814b6c31d1b6eec95410f28177e3bd00d50d87a223d392f14"} Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.816578 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvnb7\" (UniqueName: \"kubernetes.io/projected/6a3eeb39-e6e2-4824-8022-fd652d13ed03-kube-api-access-hvnb7\") pod \"placement-b89d-account-create-d82w4\" (UID: \"6a3eeb39-e6e2-4824-8022-fd652d13ed03\") " pod="openstack/placement-b89d-account-create-d82w4" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.917801 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvnb7\" (UniqueName: \"kubernetes.io/projected/6a3eeb39-e6e2-4824-8022-fd652d13ed03-kube-api-access-hvnb7\") pod \"placement-b89d-account-create-d82w4\" (UID: \"6a3eeb39-e6e2-4824-8022-fd652d13ed03\") " pod="openstack/placement-b89d-account-create-d82w4" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.937934 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvnb7\" (UniqueName: \"kubernetes.io/projected/6a3eeb39-e6e2-4824-8022-fd652d13ed03-kube-api-access-hvnb7\") pod \"placement-b89d-account-create-d82w4\" (UID: \"6a3eeb39-e6e2-4824-8022-fd652d13ed03\") " pod="openstack/placement-b89d-account-create-d82w4" Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.990095 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kwz5v" podUID="1db73295-0655-443c-91e0-2cd08b119141" containerName="ovn-controller" probeResult="failure" output=< Sep 30 13:53:39 crc kubenswrapper[4763]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 13:53:39 crc kubenswrapper[4763]: > Sep 30 13:53:39 crc kubenswrapper[4763]: I0930 13:53:39.992878 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b89d-account-create-d82w4" Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.074067 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.334256 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fac2-account-create-757bn"] Sep 30 13:53:40 crc kubenswrapper[4763]: W0930 13:53:40.342839 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod573ae1be_3060_4fea_a7fe_b7feeaa60cc7.slice/crio-de6b6de070c1b8c6d1dd107baf317b645bc1957f87d4aa7cc6188c461c44fedb WatchSource:0}: Error finding container de6b6de070c1b8c6d1dd107baf317b645bc1957f87d4aa7cc6188c461c44fedb: Status 404 returned error can't find the container with id de6b6de070c1b8c6d1dd107baf317b645bc1957f87d4aa7cc6188c461c44fedb Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.450859 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b89d-account-create-d82w4"] Sep 30 13:53:40 crc kubenswrapper[4763]: W0930 13:53:40.470357 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a3eeb39_e6e2_4824_8022_fd652d13ed03.slice/crio-3b4ba07215f9975599b07f78041bf7490b2154d8d0b57e50327aab3d8c8180bd WatchSource:0}: Error finding container 3b4ba07215f9975599b07f78041bf7490b2154d8d0b57e50327aab3d8c8180bd: Status 404 returned error can't find the container with id 3b4ba07215f9975599b07f78041bf7490b2154d8d0b57e50327aab3d8c8180bd Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.796036 4763 generic.go:334] "Generic (PLEG): container finished" podID="6a3eeb39-e6e2-4824-8022-fd652d13ed03" containerID="72f6ad706fff18c900a95959f4f354c1438166c8053468b7bedfc284a8853597" exitCode=0 Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.796130 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b89d-account-create-d82w4" event={"ID":"6a3eeb39-e6e2-4824-8022-fd652d13ed03","Type":"ContainerDied","Data":"72f6ad706fff18c900a95959f4f354c1438166c8053468b7bedfc284a8853597"} Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.796503 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b89d-account-create-d82w4" event={"ID":"6a3eeb39-e6e2-4824-8022-fd652d13ed03","Type":"ContainerStarted","Data":"3b4ba07215f9975599b07f78041bf7490b2154d8d0b57e50327aab3d8c8180bd"} Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.798320 4763 generic.go:334] "Generic (PLEG): container finished" podID="573ae1be-3060-4fea-a7fe-b7feeaa60cc7" containerID="170b6b6c50e40279bb612fcdfa9b24046c60eb591063778e915b4267e9aacc17" exitCode=0 Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.798383 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fac2-account-create-757bn" event={"ID":"573ae1be-3060-4fea-a7fe-b7feeaa60cc7","Type":"ContainerDied","Data":"170b6b6c50e40279bb612fcdfa9b24046c60eb591063778e915b4267e9aacc17"} Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.798404 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fac2-account-create-757bn" event={"ID":"573ae1be-3060-4fea-a7fe-b7feeaa60cc7","Type":"ContainerStarted","Data":"de6b6de070c1b8c6d1dd107baf317b645bc1957f87d4aa7cc6188c461c44fedb"} Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.801408 4763 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"1bb4e132326be55cfb6d2c02cfd640df1ebca518cc39286f9fe76a41c347dda4"} Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.801518 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"18e1cb42d1ac47e256e5579a10290ba641e04de49adb3a3798799607a90f1b1a"} Sep 30 13:53:40 crc kubenswrapper[4763]: I0930 13:53:40.801536 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"dfe4428a4ee91686c8b839dc094b2cea3d884fe055392d209003e50ad9cecb05"} Sep 30 13:53:41 crc kubenswrapper[4763]: I0930 13:53:41.812588 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"6913f1a8c201da716c04a9052d361c35e1f0beafd7a800065007dd41db8b8e2f"} Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.314055 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fac2-account-create-757bn" Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.327281 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b89d-account-create-d82w4" Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.458848 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6qsc\" (UniqueName: \"kubernetes.io/projected/573ae1be-3060-4fea-a7fe-b7feeaa60cc7-kube-api-access-n6qsc\") pod \"573ae1be-3060-4fea-a7fe-b7feeaa60cc7\" (UID: \"573ae1be-3060-4fea-a7fe-b7feeaa60cc7\") " Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.458922 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvnb7\" (UniqueName: \"kubernetes.io/projected/6a3eeb39-e6e2-4824-8022-fd652d13ed03-kube-api-access-hvnb7\") pod \"6a3eeb39-e6e2-4824-8022-fd652d13ed03\" (UID: \"6a3eeb39-e6e2-4824-8022-fd652d13ed03\") " Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.463911 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3eeb39-e6e2-4824-8022-fd652d13ed03-kube-api-access-hvnb7" (OuterVolumeSpecName: "kube-api-access-hvnb7") pod "6a3eeb39-e6e2-4824-8022-fd652d13ed03" (UID: "6a3eeb39-e6e2-4824-8022-fd652d13ed03"). InnerVolumeSpecName "kube-api-access-hvnb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.464390 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/573ae1be-3060-4fea-a7fe-b7feeaa60cc7-kube-api-access-n6qsc" (OuterVolumeSpecName: "kube-api-access-n6qsc") pod "573ae1be-3060-4fea-a7fe-b7feeaa60cc7" (UID: "573ae1be-3060-4fea-a7fe-b7feeaa60cc7"). InnerVolumeSpecName "kube-api-access-n6qsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.562699 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6qsc\" (UniqueName: \"kubernetes.io/projected/573ae1be-3060-4fea-a7fe-b7feeaa60cc7-kube-api-access-n6qsc\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.562748 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvnb7\" (UniqueName: \"kubernetes.io/projected/6a3eeb39-e6e2-4824-8022-fd652d13ed03-kube-api-access-hvnb7\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.821525 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b89d-account-create-d82w4" Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.821545 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b89d-account-create-d82w4" event={"ID":"6a3eeb39-e6e2-4824-8022-fd652d13ed03","Type":"ContainerDied","Data":"3b4ba07215f9975599b07f78041bf7490b2154d8d0b57e50327aab3d8c8180bd"} Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.822036 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b4ba07215f9975599b07f78041bf7490b2154d8d0b57e50327aab3d8c8180bd" Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.823295 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fac2-account-create-757bn" event={"ID":"573ae1be-3060-4fea-a7fe-b7feeaa60cc7","Type":"ContainerDied","Data":"de6b6de070c1b8c6d1dd107baf317b645bc1957f87d4aa7cc6188c461c44fedb"} Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.823324 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fac2-account-create-757bn" Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.823328 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de6b6de070c1b8c6d1dd107baf317b645bc1957f87d4aa7cc6188c461c44fedb" Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.827547 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"86fa7c116448649efc303d132f55a3b4d51ce4ff7728e8cb83a546ae8cc6be04"} Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.827582 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"17bbda96e72abf0e4fc5b512a5a8c030ec54be5d2e9697b12e425013fc6e5674"} Sep 30 13:53:42 crc kubenswrapper[4763]: I0930 13:53:42.827609 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"e59cd93a64db4c4a2e52fd8dec840f2e642f02a5898d661bf7aeab73f09ef3a3"} Sep 30 13:53:43 crc kubenswrapper[4763]: I0930 13:53:43.841577 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"580d3253de72fffc16d0a36d6429d3d8a5a8907a3681e3d5a00508e74a43aeff"} Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.854426 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"2e13d0b0d4911e364ee6a3df6a55c9fe084a5532f8df7d0fcfa8239cfa1bd7d8"} Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.854750 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"68319e480d02549a9670870fb2b799e7a229e796a6e2e64c34a0f931f5c2294a"} Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.854760 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"242dc53e835ef062c4e6ffb487f5cb2cd09de49af6b5aef18aae943dfe19b104"} Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.854768 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"7af9dffe8b6aec08e0ffc071adb335564cdf7ec832db594c4069392c84f63460"} Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.854776 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"da7282808861470139cef025a99057cbd65aa13cb0bfc0317356866852f5d03d"} Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.994937 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9e56-account-create-dxz6j"] Sep 30 13:53:44 crc kubenswrapper[4763]: E0930 13:53:44.995556 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3eeb39-e6e2-4824-8022-fd652d13ed03" containerName="mariadb-account-create" Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.995682 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3eeb39-e6e2-4824-8022-fd652d13ed03" 
containerName="mariadb-account-create" Sep 30 13:53:44 crc kubenswrapper[4763]: E0930 13:53:44.995781 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573ae1be-3060-4fea-a7fe-b7feeaa60cc7" containerName="mariadb-account-create" Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.995854 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="573ae1be-3060-4fea-a7fe-b7feeaa60cc7" containerName="mariadb-account-create" Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.996106 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3eeb39-e6e2-4824-8022-fd652d13ed03" containerName="mariadb-account-create" Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.996215 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="573ae1be-3060-4fea-a7fe-b7feeaa60cc7" containerName="mariadb-account-create" Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.996874 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e56-account-create-dxz6j" Sep 30 13:53:44 crc kubenswrapper[4763]: I0930 13:53:44.999209 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.002342 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kwz5v" podUID="1db73295-0655-443c-91e0-2cd08b119141" containerName="ovn-controller" probeResult="failure" output=< Sep 30 13:53:45 crc kubenswrapper[4763]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 13:53:45 crc kubenswrapper[4763]: > Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.003090 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9e56-account-create-dxz6j"] Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.098873 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq6nh\" (UniqueName: \"kubernetes.io/projected/62310a9e-9a81-44c2-96f2-9e7064f883e9-kube-api-access-qq6nh\") pod \"glance-9e56-account-create-dxz6j\" (UID: \"62310a9e-9a81-44c2-96f2-9e7064f883e9\") " pod="openstack/glance-9e56-account-create-dxz6j" Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.200665 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq6nh\" (UniqueName: \"kubernetes.io/projected/62310a9e-9a81-44c2-96f2-9e7064f883e9-kube-api-access-qq6nh\") pod \"glance-9e56-account-create-dxz6j\" (UID: \"62310a9e-9a81-44c2-96f2-9e7064f883e9\") " pod="openstack/glance-9e56-account-create-dxz6j" Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.221235 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq6nh\" (UniqueName: \"kubernetes.io/projected/62310a9e-9a81-44c2-96f2-9e7064f883e9-kube-api-access-qq6nh\") pod \"glance-9e56-account-create-dxz6j\" (UID: \"62310a9e-9a81-44c2-96f2-9e7064f883e9\") " pod="openstack/glance-9e56-account-create-dxz6j" Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.337405 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9e56-account-create-dxz6j" Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.586364 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9e56-account-create-dxz6j"] Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.863211 4763 generic.go:334] "Generic (PLEG): container finished" podID="62310a9e-9a81-44c2-96f2-9e7064f883e9" containerID="ed161b44111ccbcbdc21277ce31dee7a2c0f706c282e5185255018b02b3a8ce8" exitCode=0 Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.863285 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e56-account-create-dxz6j" event={"ID":"62310a9e-9a81-44c2-96f2-9e7064f883e9","Type":"ContainerDied","Data":"ed161b44111ccbcbdc21277ce31dee7a2c0f706c282e5185255018b02b3a8ce8"} Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.864434 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e56-account-create-dxz6j" event={"ID":"62310a9e-9a81-44c2-96f2-9e7064f883e9","Type":"ContainerStarted","Data":"463c592547a342dac41c7b351e4e83814f19f0a9c71211f0b60b5e27b4ebfa0a"} Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.877711 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"c52de0c97e78063fc806d8831c3e0f7eba864de7670f488d153c3e4e13e7df72"} Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.877791 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerStarted","Data":"a2c552587c9daa3eff2f6b01626bdb8930edcce9d10cefa5e6f2138456bab7ae"} Sep 30 13:53:45 crc kubenswrapper[4763]: I0930 13:53:45.908079 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.003386221 podStartE2EDuration="24.908052752s" podCreationTimestamp="2025-09-30 13:53:21 +0000 UTC" firstStartedPulling="2025-09-30 13:53:38.921794609 +0000 UTC m=+1091.060354894" lastFinishedPulling="2025-09-30 13:53:43.82646114 +0000 UTC m=+1095.965021425" observedRunningTime="2025-09-30 13:53:45.906650436 +0000 UTC m=+1098.045210751" watchObservedRunningTime="2025-09-30 13:53:45.908052752 +0000 UTC m=+1098.046613037" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.146035 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-jllqw"] Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.147665 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.150126 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.162647 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-jllqw"] Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.318353 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.318415 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-config\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.318740 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-svc\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.318887 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.318923 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-swift-storage-0\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.319008 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw94r\" (UniqueName: \"kubernetes.io/projected/fc62d0c3-527f-414f-ac05-3f788460ed17-kube-api-access-lw94r\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.330784 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.420538 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.420584 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-config\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.420681 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-svc\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.420752 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.420787 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-swift-storage-0\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.420844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw94r\" (UniqueName: \"kubernetes.io/projected/fc62d0c3-527f-414f-ac05-3f788460ed17-kube-api-access-lw94r\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.423387 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-svc\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.423489 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.423698 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-config\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.424829 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-swift-storage-0\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.439474 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.456781 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw94r\" (UniqueName: \"kubernetes.io/projected/fc62d0c3-527f-414f-ac05-3f788460ed17-kube-api-access-lw94r\") pod \"dnsmasq-dns-6cfbb96789-jllqw\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") " pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.464770 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.634441 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-dnvd2"] Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.636014 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dnvd2" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.657777 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.696212 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dnvd2"] Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.730820 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgbwg\" (UniqueName: \"kubernetes.io/projected/303b3bc4-dd2a-4f55-8961-31033f17652c-kube-api-access-rgbwg\") pod \"barbican-db-create-dnvd2\" (UID: \"303b3bc4-dd2a-4f55-8961-31033f17652c\") " pod="openstack/barbican-db-create-dnvd2" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.834918 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgbwg\" (UniqueName: \"kubernetes.io/projected/303b3bc4-dd2a-4f55-8961-31033f17652c-kube-api-access-rgbwg\") pod \"barbican-db-create-dnvd2\" (UID: \"303b3bc4-dd2a-4f55-8961-31033f17652c\") " pod="openstack/barbican-db-create-dnvd2" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.841496 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-7ffj9"] Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.842734 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-7ffj9" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.857258 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7ffj9"] Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.874838 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgbwg\" (UniqueName: \"kubernetes.io/projected/303b3bc4-dd2a-4f55-8961-31033f17652c-kube-api-access-rgbwg\") pod \"barbican-db-create-dnvd2\" (UID: \"303b3bc4-dd2a-4f55-8961-31033f17652c\") " pod="openstack/barbican-db-create-dnvd2" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.935744 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zjwcr"] Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.936338 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2278m\" (UniqueName: \"kubernetes.io/projected/0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e-kube-api-access-2278m\") pod \"cinder-db-create-7ffj9\" (UID: \"0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e\") " pod="openstack/cinder-db-create-7ffj9" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.938459 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zjwcr" Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.943490 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zjwcr"] Sep 30 13:53:46 crc kubenswrapper[4763]: I0930 13:53:46.974249 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dnvd2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.038534 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw2zw\" (UniqueName: \"kubernetes.io/projected/3ed6a710-9784-49a1-aa61-59c509f2ff3d-kube-api-access-rw2zw\") pod \"neutron-db-create-zjwcr\" (UID: \"3ed6a710-9784-49a1-aa61-59c509f2ff3d\") " pod="openstack/neutron-db-create-zjwcr" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.038767 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2278m\" (UniqueName: \"kubernetes.io/projected/0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e-kube-api-access-2278m\") pod \"cinder-db-create-7ffj9\" (UID: \"0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e\") " pod="openstack/cinder-db-create-7ffj9" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.062479 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2278m\" (UniqueName: \"kubernetes.io/projected/0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e-kube-api-access-2278m\") pod \"cinder-db-create-7ffj9\" (UID: \"0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e\") " pod="openstack/cinder-db-create-7ffj9" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.114675 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-jllqw"] Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.120970 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-7hcz2"] Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.122056 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.124341 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.124715 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pj6pt" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.124857 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.125011 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 13:53:47 crc kubenswrapper[4763]: W0930 13:53:47.131629 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc62d0c3_527f_414f_ac05_3f788460ed17.slice/crio-cbe05eec355533b9f4d42930465b94c9b197c3bf492c968b8c3831c3690062b4 WatchSource:0}: Error finding container cbe05eec355533b9f4d42930465b94c9b197c3bf492c968b8c3831c3690062b4: Status 404 returned error can't find the container with id cbe05eec355533b9f4d42930465b94c9b197c3bf492c968b8c3831c3690062b4 Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.140014 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7hcz2"] Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.141586 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw2zw\" (UniqueName: \"kubernetes.io/projected/3ed6a710-9784-49a1-aa61-59c509f2ff3d-kube-api-access-rw2zw\") pod \"neutron-db-create-zjwcr\" (UID: \"3ed6a710-9784-49a1-aa61-59c509f2ff3d\") " pod="openstack/neutron-db-create-zjwcr" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.159967 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw2zw\" (UniqueName: \"kubernetes.io/projected/3ed6a710-9784-49a1-aa61-59c509f2ff3d-kube-api-access-rw2zw\") pod \"neutron-db-create-zjwcr\" (UID: \"3ed6a710-9784-49a1-aa61-59c509f2ff3d\") " pod="openstack/neutron-db-create-zjwcr" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.169429 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-7ffj9" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.243645 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-combined-ca-bundle\") pod \"keystone-db-sync-7hcz2\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.243718 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jglll\" (UniqueName: \"kubernetes.io/projected/db237d2f-d736-42bd-bd84-dc9d93909367-kube-api-access-jglll\") pod \"keystone-db-sync-7hcz2\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.243922 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-config-data\") pod \"keystone-db-sync-7hcz2\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.244508 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e56-account-create-dxz6j" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.262350 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zjwcr" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.345295 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq6nh\" (UniqueName: \"kubernetes.io/projected/62310a9e-9a81-44c2-96f2-9e7064f883e9-kube-api-access-qq6nh\") pod \"62310a9e-9a81-44c2-96f2-9e7064f883e9\" (UID: \"62310a9e-9a81-44c2-96f2-9e7064f883e9\") " Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.345549 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-config-data\") pod \"keystone-db-sync-7hcz2\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.345645 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-combined-ca-bundle\") pod \"keystone-db-sync-7hcz2\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.345692 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jglll\" (UniqueName: \"kubernetes.io/projected/db237d2f-d736-42bd-bd84-dc9d93909367-kube-api-access-jglll\") pod \"keystone-db-sync-7hcz2\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.352111 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62310a9e-9a81-44c2-96f2-9e7064f883e9-kube-api-access-qq6nh" (OuterVolumeSpecName: "kube-api-access-qq6nh") pod "62310a9e-9a81-44c2-96f2-9e7064f883e9" (UID: "62310a9e-9a81-44c2-96f2-9e7064f883e9"). InnerVolumeSpecName "kube-api-access-qq6nh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.353320 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-config-data\") pod \"keystone-db-sync-7hcz2\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.354652 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-combined-ca-bundle\") pod \"keystone-db-sync-7hcz2\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.374879 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jglll\" (UniqueName: \"kubernetes.io/projected/db237d2f-d736-42bd-bd84-dc9d93909367-kube-api-access-jglll\") pod \"keystone-db-sync-7hcz2\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.447416 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq6nh\" (UniqueName: \"kubernetes.io/projected/62310a9e-9a81-44c2-96f2-9e7064f883e9-kube-api-access-qq6nh\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.456086 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.541863 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7ffj9"] Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.611539 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dnvd2"] Sep 30 13:53:47 crc kubenswrapper[4763]: W0930 13:53:47.658978 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod303b3bc4_dd2a_4f55_8961_31033f17652c.slice/crio-9da99c2e8ff10d96b36a1c3855bee8cfcdfd60b120787849fead2afbabd9d849 WatchSource:0}: Error finding container 9da99c2e8ff10d96b36a1c3855bee8cfcdfd60b120787849fead2afbabd9d849: Status 404 returned error can't find the container with id 9da99c2e8ff10d96b36a1c3855bee8cfcdfd60b120787849fead2afbabd9d849 Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.900890 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dnvd2" event={"ID":"303b3bc4-dd2a-4f55-8961-31033f17652c","Type":"ContainerStarted","Data":"ab5560070012ef542fb548a412733c5149604a4b26330c36843bead4e81247e0"} Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.902151 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dnvd2" event={"ID":"303b3bc4-dd2a-4f55-8961-31033f17652c","Type":"ContainerStarted","Data":"9da99c2e8ff10d96b36a1c3855bee8cfcdfd60b120787849fead2afbabd9d849"} Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.904047 4763 generic.go:334] "Generic (PLEG): container finished" podID="fc62d0c3-527f-414f-ac05-3f788460ed17" containerID="e6f8fa77a2b984e9027d916245f39ff10a0bf44210e52baa8a973f3c8f0e9867" exitCode=0 Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.904146 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" 
event={"ID":"fc62d0c3-527f-414f-ac05-3f788460ed17","Type":"ContainerDied","Data":"e6f8fa77a2b984e9027d916245f39ff10a0bf44210e52baa8a973f3c8f0e9867"} Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.904181 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" event={"ID":"fc62d0c3-527f-414f-ac05-3f788460ed17","Type":"ContainerStarted","Data":"cbe05eec355533b9f4d42930465b94c9b197c3bf492c968b8c3831c3690062b4"} Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.907146 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e56-account-create-dxz6j" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.907144 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e56-account-create-dxz6j" event={"ID":"62310a9e-9a81-44c2-96f2-9e7064f883e9","Type":"ContainerDied","Data":"463c592547a342dac41c7b351e4e83814f19f0a9c71211f0b60b5e27b4ebfa0a"} Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.907303 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="463c592547a342dac41c7b351e4e83814f19f0a9c71211f0b60b5e27b4ebfa0a" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.910630 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7ffj9" event={"ID":"0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e","Type":"ContainerStarted","Data":"b8ccf04d26a147ca6c7ea6f6c38cf982b002fc378fcf0c8ce0053f5a185448b5"} Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.910670 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7ffj9" event={"ID":"0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e","Type":"ContainerStarted","Data":"5df7a7e47d6f2639ce0f35376f5e462d7c8b84a6f3adffad88b53e06bf1a024d"} Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.920690 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zjwcr"] Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.929727 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-dnvd2" podStartSLOduration=1.929707659 podStartE2EDuration="1.929707659s" podCreationTimestamp="2025-09-30 13:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:53:47.926471397 +0000 UTC m=+1100.065031672" watchObservedRunningTime="2025-09-30 13:53:47.929707659 +0000 UTC m=+1100.068267944" Sep 30 13:53:47 crc kubenswrapper[4763]: W0930 13:53:47.933997 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ed6a710_9784_49a1_aa61_59c509f2ff3d.slice/crio-b544c5bbe505601eff18967fe1407202f2f1125ae145a168b05faed1bb77e6a5 WatchSource:0}: Error finding container b544c5bbe505601eff18967fe1407202f2f1125ae145a168b05faed1bb77e6a5: Status 404 returned error can't find the container with id b544c5bbe505601eff18967fe1407202f2f1125ae145a168b05faed1bb77e6a5 Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.970149 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-7ffj9" podStartSLOduration=1.970127993 podStartE2EDuration="1.970127993s" podCreationTimestamp="2025-09-30 13:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:53:47.949425364 +0000 UTC m=+1100.087985649" 
watchObservedRunningTime="2025-09-30 13:53:47.970127993 +0000 UTC m=+1100.108688278" Sep 30 13:53:47 crc kubenswrapper[4763]: I0930 13:53:47.977081 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7hcz2"] Sep 30 13:53:48 crc kubenswrapper[4763]: W0930 13:53:48.101838 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb237d2f_d736_42bd_bd84_dc9d93909367.slice/crio-932cb6be7b0553d820e930e278bb3e62daf3ea020f9fd8b76f790ca0c2865332 WatchSource:0}: Error finding container 932cb6be7b0553d820e930e278bb3e62daf3ea020f9fd8b76f790ca0c2865332: Status 404 returned error can't find the container with id 932cb6be7b0553d820e930e278bb3e62daf3ea020f9fd8b76f790ca0c2865332 Sep 30 13:53:48 crc kubenswrapper[4763]: I0930 13:53:48.922072 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" event={"ID":"fc62d0c3-527f-414f-ac05-3f788460ed17","Type":"ContainerStarted","Data":"c03167ad2c60e33e2d2dca763f934331e66181e40cf32a7a135bab3e3ac6aa0d"} Sep 30 13:53:48 crc kubenswrapper[4763]: I0930 13:53:48.925004 4763 generic.go:334] "Generic (PLEG): container finished" podID="0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e" containerID="b8ccf04d26a147ca6c7ea6f6c38cf982b002fc378fcf0c8ce0053f5a185448b5" exitCode=0 Sep 30 13:53:48 crc kubenswrapper[4763]: I0930 13:53:48.925218 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7ffj9" event={"ID":"0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e","Type":"ContainerDied","Data":"b8ccf04d26a147ca6c7ea6f6c38cf982b002fc378fcf0c8ce0053f5a185448b5"} Sep 30 13:53:48 crc kubenswrapper[4763]: I0930 13:53:48.927026 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7hcz2" event={"ID":"db237d2f-d736-42bd-bd84-dc9d93909367","Type":"ContainerStarted","Data":"932cb6be7b0553d820e930e278bb3e62daf3ea020f9fd8b76f790ca0c2865332"} Sep 30 13:53:48 crc kubenswrapper[4763]: I0930 13:53:48.932925 4763 generic.go:334] "Generic (PLEG): container finished" podID="303b3bc4-dd2a-4f55-8961-31033f17652c" containerID="ab5560070012ef542fb548a412733c5149604a4b26330c36843bead4e81247e0" exitCode=0 Sep 30 13:53:48 crc kubenswrapper[4763]: I0930 13:53:48.932997 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dnvd2" event={"ID":"303b3bc4-dd2a-4f55-8961-31033f17652c","Type":"ContainerDied","Data":"ab5560070012ef542fb548a412733c5149604a4b26330c36843bead4e81247e0"} Sep 30 13:53:48 crc kubenswrapper[4763]: I0930 13:53:48.937554 4763 generic.go:334] "Generic (PLEG): container finished" podID="3ed6a710-9784-49a1-aa61-59c509f2ff3d" containerID="482012e4eb14524c218a4c31ad892d834468e3f45f8db3a12de6d337c52a9f32" exitCode=0 Sep 30 13:53:48 crc kubenswrapper[4763]: I0930 13:53:48.937618 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zjwcr" event={"ID":"3ed6a710-9784-49a1-aa61-59c509f2ff3d","Type":"ContainerDied","Data":"482012e4eb14524c218a4c31ad892d834468e3f45f8db3a12de6d337c52a9f32"} Sep 30 13:53:48 crc kubenswrapper[4763]: I0930 13:53:48.937646 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zjwcr" event={"ID":"3ed6a710-9784-49a1-aa61-59c509f2ff3d","Type":"ContainerStarted","Data":"b544c5bbe505601eff18967fe1407202f2f1125ae145a168b05faed1bb77e6a5"} Sep 30 13:53:48 crc kubenswrapper[4763]: I0930 13:53:48.949360 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" podStartSLOduration=2.9493396929999998 podStartE2EDuration="2.949339693s" podCreationTimestamp="2025-09-30 13:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:53:48.944769738 +0000 UTC m=+1101.083330023" watchObservedRunningTime="2025-09-30 13:53:48.949339693 +0000 UTC m=+1101.087899978" Sep 30 13:53:49 crc kubenswrapper[4763]: I0930 13:53:49.944768 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:49 crc kubenswrapper[4763]: I0930 13:53:49.989247 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kwz5v" podUID="1db73295-0655-443c-91e0-2cd08b119141" containerName="ovn-controller" probeResult="failure" output=< Sep 30 13:53:49 crc kubenswrapper[4763]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 13:53:49 crc kubenswrapper[4763]: > Sep 30 13:53:50 crc kubenswrapper[4763]: I0930 13:53:50.030300 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.145299 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-q6kdz"] Sep 30 13:53:51 crc kubenswrapper[4763]: E0930 13:53:50.145871 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62310a9e-9a81-44c2-96f2-9e7064f883e9" containerName="mariadb-account-create" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.145891 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="62310a9e-9a81-44c2-96f2-9e7064f883e9" containerName="mariadb-account-create" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.146136 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="62310a9e-9a81-44c2-96f2-9e7064f883e9" containerName="mariadb-account-create" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.146725 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.153870 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-q6kdz"] Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.154053 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.154538 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xn426" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.224145 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-db-sync-config-data\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.224538 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-combined-ca-bundle\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.224583 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57dpd\" (UniqueName: \"kubernetes.io/projected/7c3f0264-cce9-436f-923d-79f807488437-kube-api-access-57dpd\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.224657 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-config-data\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.268635 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kwz5v-config-wj4ll"] Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.269826 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.272238 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.283465 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kwz5v-config-wj4ll"] Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.325554 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-db-sync-config-data\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.325727 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-combined-ca-bundle\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.325768 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57dpd\" (UniqueName: \"kubernetes.io/projected/7c3f0264-cce9-436f-923d-79f807488437-kube-api-access-57dpd\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.325822 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-config-data\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.335330 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-config-data\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.337358 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-combined-ca-bundle\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.344962 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57dpd\" (UniqueName: \"kubernetes.io/projected/7c3f0264-cce9-436f-923d-79f807488437-kube-api-access-57dpd\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.345340 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-db-sync-config-data\") pod \"glance-db-sync-q6kdz\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.427925 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-additional-scripts\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.427976 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run-ovn\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.428012 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-scripts\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.428076 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.428149 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8vsj\" (UniqueName: \"kubernetes.io/projected/97702b69-cf07-4ac1-afe4-5afc71b107c9-kube-api-access-q8vsj\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.428307 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-log-ovn\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.489129 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-q6kdz" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.529414 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-additional-scripts\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.529461 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run-ovn\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.529504 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-scripts\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.529551 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.529577 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8vsj\" (UniqueName: \"kubernetes.io/projected/97702b69-cf07-4ac1-afe4-5afc71b107c9-kube-api-access-q8vsj\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.529674 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-log-ovn\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.529764 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-log-ovn\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.529769 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run-ovn\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.529846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " 
pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.530413 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-additional-scripts\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.531803 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-scripts\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.557470 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8vsj\" (UniqueName: \"kubernetes.io/projected/97702b69-cf07-4ac1-afe4-5afc71b107c9-kube-api-access-q8vsj\") pod \"ovn-controller-kwz5v-config-wj4ll\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:51 crc kubenswrapper[4763]: I0930 13:53:50.608738 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.874277 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zjwcr" Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.881968 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7ffj9" Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.938404 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dnvd2" Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.969442 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2278m\" (UniqueName: \"kubernetes.io/projected/0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e-kube-api-access-2278m\") pod \"0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e\" (UID: \"0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e\") " Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.969548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw2zw\" (UniqueName: \"kubernetes.io/projected/3ed6a710-9784-49a1-aa61-59c509f2ff3d-kube-api-access-rw2zw\") pod \"3ed6a710-9784-49a1-aa61-59c509f2ff3d\" (UID: \"3ed6a710-9784-49a1-aa61-59c509f2ff3d\") " Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.974661 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed6a710-9784-49a1-aa61-59c509f2ff3d-kube-api-access-rw2zw" (OuterVolumeSpecName: "kube-api-access-rw2zw") pod "3ed6a710-9784-49a1-aa61-59c509f2ff3d" (UID: "3ed6a710-9784-49a1-aa61-59c509f2ff3d"). InnerVolumeSpecName "kube-api-access-rw2zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.975317 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e-kube-api-access-2278m" (OuterVolumeSpecName: "kube-api-access-2278m") pod "0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e" (UID: "0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e"). 
InnerVolumeSpecName "kube-api-access-2278m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.982808 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zjwcr" Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.982813 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zjwcr" event={"ID":"3ed6a710-9784-49a1-aa61-59c509f2ff3d","Type":"ContainerDied","Data":"b544c5bbe505601eff18967fe1407202f2f1125ae145a168b05faed1bb77e6a5"} Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.982853 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b544c5bbe505601eff18967fe1407202f2f1125ae145a168b05faed1bb77e6a5" Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.984790 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7ffj9" Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.984779 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7ffj9" event={"ID":"0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e","Type":"ContainerDied","Data":"5df7a7e47d6f2639ce0f35376f5e462d7c8b84a6f3adffad88b53e06bf1a024d"} Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.984983 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5df7a7e47d6f2639ce0f35376f5e462d7c8b84a6f3adffad88b53e06bf1a024d" Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.986256 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dnvd2" event={"ID":"303b3bc4-dd2a-4f55-8961-31033f17652c","Type":"ContainerDied","Data":"9da99c2e8ff10d96b36a1c3855bee8cfcdfd60b120787849fead2afbabd9d849"} Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.986365 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9da99c2e8ff10d96b36a1c3855bee8cfcdfd60b120787849fead2afbabd9d849" Sep 30 13:53:52 crc kubenswrapper[4763]: I0930 13:53:52.986320 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dnvd2" Sep 30 13:53:53 crc kubenswrapper[4763]: I0930 13:53:53.070734 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgbwg\" (UniqueName: \"kubernetes.io/projected/303b3bc4-dd2a-4f55-8961-31033f17652c-kube-api-access-rgbwg\") pod \"303b3bc4-dd2a-4f55-8961-31033f17652c\" (UID: \"303b3bc4-dd2a-4f55-8961-31033f17652c\") " Sep 30 13:53:53 crc kubenswrapper[4763]: I0930 13:53:53.071095 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2278m\" (UniqueName: \"kubernetes.io/projected/0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e-kube-api-access-2278m\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:53 crc kubenswrapper[4763]: I0930 13:53:53.071116 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw2zw\" (UniqueName: \"kubernetes.io/projected/3ed6a710-9784-49a1-aa61-59c509f2ff3d-kube-api-access-rw2zw\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:53 crc kubenswrapper[4763]: I0930 13:53:53.073841 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303b3bc4-dd2a-4f55-8961-31033f17652c-kube-api-access-rgbwg" (OuterVolumeSpecName: "kube-api-access-rgbwg") pod "303b3bc4-dd2a-4f55-8961-31033f17652c" (UID: "303b3bc4-dd2a-4f55-8961-31033f17652c"). 
InnerVolumeSpecName "kube-api-access-rgbwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:53 crc kubenswrapper[4763]: W0930 13:53:53.171157 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c3f0264_cce9_436f_923d_79f807488437.slice/crio-3bd44a08b3b2854300fa27f2dbc2e4d0eb17a36cb3ac60ccb5c69b81a11f65a7 WatchSource:0}: Error finding container 3bd44a08b3b2854300fa27f2dbc2e4d0eb17a36cb3ac60ccb5c69b81a11f65a7: Status 404 returned error can't find the container with id 3bd44a08b3b2854300fa27f2dbc2e4d0eb17a36cb3ac60ccb5c69b81a11f65a7 Sep 30 13:53:53 crc kubenswrapper[4763]: I0930 13:53:53.172219 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgbwg\" (UniqueName: \"kubernetes.io/projected/303b3bc4-dd2a-4f55-8961-31033f17652c-kube-api-access-rgbwg\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:53 crc kubenswrapper[4763]: I0930 13:53:53.180026 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-q6kdz"] Sep 30 13:53:53 crc kubenswrapper[4763]: I0930 13:53:53.217268 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kwz5v-config-wj4ll"] Sep 30 13:53:53 crc kubenswrapper[4763]: I0930 13:53:53.995010 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-q6kdz" event={"ID":"7c3f0264-cce9-436f-923d-79f807488437","Type":"ContainerStarted","Data":"3bd44a08b3b2854300fa27f2dbc2e4d0eb17a36cb3ac60ccb5c69b81a11f65a7"} Sep 30 13:53:53 crc kubenswrapper[4763]: I0930 13:53:53.997013 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7hcz2" event={"ID":"db237d2f-d736-42bd-bd84-dc9d93909367","Type":"ContainerStarted","Data":"2988167e296d4244bebecb8ac04deccc2b6566a6c43096907b36078148436d85"} Sep 30 13:53:54 crc kubenswrapper[4763]: I0930 13:53:54.000078 4763 generic.go:334] "Generic (PLEG): container finished" podID="97702b69-cf07-4ac1-afe4-5afc71b107c9" containerID="869a8cdf28fb93309ac2159cf23a98dda8d452f10c75857366f1baeb925873e1" exitCode=0 Sep 30 13:53:54 crc kubenswrapper[4763]: I0930 13:53:54.000113 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kwz5v-config-wj4ll" event={"ID":"97702b69-cf07-4ac1-afe4-5afc71b107c9","Type":"ContainerDied","Data":"869a8cdf28fb93309ac2159cf23a98dda8d452f10c75857366f1baeb925873e1"} Sep 30 13:53:54 crc kubenswrapper[4763]: I0930 13:53:54.000131 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kwz5v-config-wj4ll" event={"ID":"97702b69-cf07-4ac1-afe4-5afc71b107c9","Type":"ContainerStarted","Data":"6c444c9769f587b5b1c2ad356b9f71457f1c8f6e99441ac7cd332f79157c5cd2"} Sep 30 13:53:54 crc kubenswrapper[4763]: I0930 13:53:54.044967 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-7hcz2" podStartSLOduration=2.3972739 podStartE2EDuration="7.044943558s" podCreationTimestamp="2025-09-30 13:53:47 +0000 UTC" firstStartedPulling="2025-09-30 13:53:48.111325258 +0000 UTC m=+1100.249885543" lastFinishedPulling="2025-09-30 13:53:52.758994916 +0000 UTC m=+1104.897555201" observedRunningTime="2025-09-30 13:53:54.015912599 +0000 UTC m=+1106.154472884" watchObservedRunningTime="2025-09-30 13:53:54.044943558 +0000 UTC m=+1106.183503843" Sep 30 13:53:54 crc kubenswrapper[4763]: I0930 13:53:54.977002 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-kwz5v" Sep 
30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.335464 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.410576 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run\") pod \"97702b69-cf07-4ac1-afe4-5afc71b107c9\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.410640 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-log-ovn\") pod \"97702b69-cf07-4ac1-afe4-5afc71b107c9\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.410677 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run" (OuterVolumeSpecName: "var-run") pod "97702b69-cf07-4ac1-afe4-5afc71b107c9" (UID: "97702b69-cf07-4ac1-afe4-5afc71b107c9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.410700 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-scripts\") pod \"97702b69-cf07-4ac1-afe4-5afc71b107c9\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.410711 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "97702b69-cf07-4ac1-afe4-5afc71b107c9" (UID: "97702b69-cf07-4ac1-afe4-5afc71b107c9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.410804 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run-ovn\") pod \"97702b69-cf07-4ac1-afe4-5afc71b107c9\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.410830 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8vsj\" (UniqueName: \"kubernetes.io/projected/97702b69-cf07-4ac1-afe4-5afc71b107c9-kube-api-access-q8vsj\") pod \"97702b69-cf07-4ac1-afe4-5afc71b107c9\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.410876 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "97702b69-cf07-4ac1-afe4-5afc71b107c9" (UID: "97702b69-cf07-4ac1-afe4-5afc71b107c9"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.410896 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-additional-scripts\") pod \"97702b69-cf07-4ac1-afe4-5afc71b107c9\" (UID: \"97702b69-cf07-4ac1-afe4-5afc71b107c9\") " Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.411240 4763 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.411257 4763 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.411265 4763 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/97702b69-cf07-4ac1-afe4-5afc71b107c9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.411530 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "97702b69-cf07-4ac1-afe4-5afc71b107c9" (UID: "97702b69-cf07-4ac1-afe4-5afc71b107c9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.412089 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-scripts" (OuterVolumeSpecName: "scripts") pod "97702b69-cf07-4ac1-afe4-5afc71b107c9" (UID: "97702b69-cf07-4ac1-afe4-5afc71b107c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.416292 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97702b69-cf07-4ac1-afe4-5afc71b107c9-kube-api-access-q8vsj" (OuterVolumeSpecName: "kube-api-access-q8vsj") pod "97702b69-cf07-4ac1-afe4-5afc71b107c9" (UID: "97702b69-cf07-4ac1-afe4-5afc71b107c9"). InnerVolumeSpecName "kube-api-access-q8vsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.512681 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8vsj\" (UniqueName: \"kubernetes.io/projected/97702b69-cf07-4ac1-afe4-5afc71b107c9-kube-api-access-q8vsj\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.512709 4763 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:55 crc kubenswrapper[4763]: I0930 13:53:55.512721 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97702b69-cf07-4ac1-afe4-5afc71b107c9-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.024001 4763 generic.go:334] "Generic (PLEG): container finished" podID="db237d2f-d736-42bd-bd84-dc9d93909367" containerID="2988167e296d4244bebecb8ac04deccc2b6566a6c43096907b36078148436d85" exitCode=0 Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.024179 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7hcz2" event={"ID":"db237d2f-d736-42bd-bd84-dc9d93909367","Type":"ContainerDied","Data":"2988167e296d4244bebecb8ac04deccc2b6566a6c43096907b36078148436d85"} Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.026395 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kwz5v-config-wj4ll" event={"ID":"97702b69-cf07-4ac1-afe4-5afc71b107c9","Type":"ContainerDied","Data":"6c444c9769f587b5b1c2ad356b9f71457f1c8f6e99441ac7cd332f79157c5cd2"} Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.026421 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c444c9769f587b5b1c2ad356b9f71457f1c8f6e99441ac7cd332f79157c5cd2" Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.026429 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kwz5v-config-wj4ll" Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.425938 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kwz5v-config-wj4ll"] Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.434649 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kwz5v-config-wj4ll"] Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.466800 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.500679 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97702b69-cf07-4ac1-afe4-5afc71b107c9" path="/var/lib/kubelet/pods/97702b69-cf07-4ac1-afe4-5afc71b107c9/volumes" Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.523802 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"] Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.524082 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw" podUID="9a57f6d1-c09b-410a-af3c-8b3a010da11a" containerName="dnsmasq-dns" containerID="cri-o://c361eaf5f095ffaa66bf4d8a6a114f837e677ebb597ac00b0ddebf3497844457" gracePeriod=10 Sep 30 13:53:56 crc kubenswrapper[4763]: I0930 13:53:56.666531 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw" podUID="9a57f6d1-c09b-410a-af3c-8b3a010da11a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: connect: connection refused" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.037819 4763 generic.go:334] "Generic (PLEG): container finished" podID="9a57f6d1-c09b-410a-af3c-8b3a010da11a" containerID="c361eaf5f095ffaa66bf4d8a6a114f837e677ebb597ac00b0ddebf3497844457" exitCode=0 Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.037910 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw" event={"ID":"9a57f6d1-c09b-410a-af3c-8b3a010da11a","Type":"ContainerDied","Data":"c361eaf5f095ffaa66bf4d8a6a114f837e677ebb597ac00b0ddebf3497844457"} Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.037958 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw" event={"ID":"9a57f6d1-c09b-410a-af3c-8b3a010da11a","Type":"ContainerDied","Data":"25ad75ad9fe38cf50b72d806c4759f006f98ae948d7e2d0baf0de5169a082bb8"} Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.037973 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25ad75ad9fe38cf50b72d806c4759f006f98ae948d7e2d0baf0de5169a082bb8" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.045627 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.147809 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l5hx\" (UniqueName: \"kubernetes.io/projected/9a57f6d1-c09b-410a-af3c-8b3a010da11a-kube-api-access-5l5hx\") pod \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.147860 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-sb\") pod \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.147898 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-nb\") pod \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.147963 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-dns-svc\") pod \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.147986 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-config\") pod \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\" (UID: \"9a57f6d1-c09b-410a-af3c-8b3a010da11a\") " Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.164573 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a57f6d1-c09b-410a-af3c-8b3a010da11a-kube-api-access-5l5hx" (OuterVolumeSpecName: "kube-api-access-5l5hx") pod "9a57f6d1-c09b-410a-af3c-8b3a010da11a" (UID: "9a57f6d1-c09b-410a-af3c-8b3a010da11a"). InnerVolumeSpecName "kube-api-access-5l5hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.193324 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a57f6d1-c09b-410a-af3c-8b3a010da11a" (UID: "9a57f6d1-c09b-410a-af3c-8b3a010da11a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.199667 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a57f6d1-c09b-410a-af3c-8b3a010da11a" (UID: "9a57f6d1-c09b-410a-af3c-8b3a010da11a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.218530 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-config" (OuterVolumeSpecName: "config") pod "9a57f6d1-c09b-410a-af3c-8b3a010da11a" (UID: "9a57f6d1-c09b-410a-af3c-8b3a010da11a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.232303 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a57f6d1-c09b-410a-af3c-8b3a010da11a" (UID: "9a57f6d1-c09b-410a-af3c-8b3a010da11a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.250725 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l5hx\" (UniqueName: \"kubernetes.io/projected/9a57f6d1-c09b-410a-af3c-8b3a010da11a-kube-api-access-5l5hx\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.250763 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.250775 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.250801 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.250814 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a57f6d1-c09b-410a-af3c-8b3a010da11a-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.389040 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.560777 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jglll\" (UniqueName: \"kubernetes.io/projected/db237d2f-d736-42bd-bd84-dc9d93909367-kube-api-access-jglll\") pod \"db237d2f-d736-42bd-bd84-dc9d93909367\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.560857 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-combined-ca-bundle\") pod \"db237d2f-d736-42bd-bd84-dc9d93909367\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.560997 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-config-data\") pod \"db237d2f-d736-42bd-bd84-dc9d93909367\" (UID: \"db237d2f-d736-42bd-bd84-dc9d93909367\") " Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.566032 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db237d2f-d736-42bd-bd84-dc9d93909367-kube-api-access-jglll" (OuterVolumeSpecName: "kube-api-access-jglll") pod "db237d2f-d736-42bd-bd84-dc9d93909367" (UID: "db237d2f-d736-42bd-bd84-dc9d93909367"). InnerVolumeSpecName "kube-api-access-jglll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.596025 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db237d2f-d736-42bd-bd84-dc9d93909367" (UID: "db237d2f-d736-42bd-bd84-dc9d93909367"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.620482 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-config-data" (OuterVolumeSpecName: "config-data") pod "db237d2f-d736-42bd-bd84-dc9d93909367" (UID: "db237d2f-d736-42bd-bd84-dc9d93909367"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.663949 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.663991 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jglll\" (UniqueName: \"kubernetes.io/projected/db237d2f-d736-42bd-bd84-dc9d93909367-kube-api-access-jglll\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:57 crc kubenswrapper[4763]: I0930 13:53:57.664003 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db237d2f-d736-42bd-bd84-dc9d93909367-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.071026 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c6d5d5bd7-slwbw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.071163 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7hcz2" event={"ID":"db237d2f-d736-42bd-bd84-dc9d93909367","Type":"ContainerDied","Data":"932cb6be7b0553d820e930e278bb3e62daf3ea020f9fd8b76f790ca0c2865332"} Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.071212 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="932cb6be7b0553d820e930e278bb3e62daf3ea020f9fd8b76f790ca0c2865332" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.071614 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7hcz2" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.108401 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.114167 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-slwbw"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.237650 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-5lpdw"] Sep 30 13:53:58 crc kubenswrapper[4763]: E0930 13:53:58.238028 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db237d2f-d736-42bd-bd84-dc9d93909367" containerName="keystone-db-sync" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238050 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="db237d2f-d736-42bd-bd84-dc9d93909367" containerName="keystone-db-sync" Sep 30 13:53:58 crc kubenswrapper[4763]: E0930 13:53:58.238068 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97702b69-cf07-4ac1-afe4-5afc71b107c9" containerName="ovn-config" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238077 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="97702b69-cf07-4ac1-afe4-5afc71b107c9" containerName="ovn-config" Sep 30 13:53:58 crc kubenswrapper[4763]: E0930 13:53:58.238087 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303b3bc4-dd2a-4f55-8961-31033f17652c" containerName="mariadb-database-create" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238094 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="303b3bc4-dd2a-4f55-8961-31033f17652c" containerName="mariadb-database-create" Sep 30 13:53:58 crc kubenswrapper[4763]: E0930 13:53:58.238109 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed6a710-9784-49a1-aa61-59c509f2ff3d" containerName="mariadb-database-create" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238116 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed6a710-9784-49a1-aa61-59c509f2ff3d" containerName="mariadb-database-create" Sep 30 13:53:58 crc kubenswrapper[4763]: E0930 13:53:58.238130 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a57f6d1-c09b-410a-af3c-8b3a010da11a" containerName="init" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238139 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a57f6d1-c09b-410a-af3c-8b3a010da11a" containerName="init" Sep 30 13:53:58 crc kubenswrapper[4763]: E0930 13:53:58.238153 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e" containerName="mariadb-database-create" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238160 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e" containerName="mariadb-database-create" Sep 30 13:53:58 crc kubenswrapper[4763]: E0930 13:53:58.238182 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a57f6d1-c09b-410a-af3c-8b3a010da11a" containerName="dnsmasq-dns" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238191 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a57f6d1-c09b-410a-af3c-8b3a010da11a" containerName="dnsmasq-dns" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238389 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="97702b69-cf07-4ac1-afe4-5afc71b107c9" containerName="ovn-config" Sep 30 
13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238406 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="db237d2f-d736-42bd-bd84-dc9d93909367" containerName="keystone-db-sync" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238423 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="303b3bc4-dd2a-4f55-8961-31033f17652c" containerName="mariadb-database-create" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238440 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e" containerName="mariadb-database-create" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238454 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed6a710-9784-49a1-aa61-59c509f2ff3d" containerName="mariadb-database-create" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.238467 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a57f6d1-c09b-410a-af3c-8b3a010da11a" containerName="dnsmasq-dns" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.239551 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.262711 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-5lpdw"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.307547 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6ncgn"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.308558 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.314913 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.315041 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.315139 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.315269 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pj6pt" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.330810 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6ncgn"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.378128 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tlgb\" (UniqueName: \"kubernetes.io/projected/84e463e9-da58-476b-8c7d-5d7745c86118-kube-api-access-5tlgb\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.378437 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-swift-storage-0\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.378465 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
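The burst of "RemoveStaleState: removing container" / "Deleted CPUSet assignment" entries above is the CPU and memory managers purging per-container state for pods that no longer exist, triggered when a new pod (here dnsmasq-dns-7f4777664c-5lpdw) is admitted. A sketch of that cleanup under toy types; the state layout is an assumption for illustration:

package main

import "fmt"

// key identifies per-container state kept by the CPU/memory managers.
type key struct{ podUID, container string }

// removeStaleState drops state for containers whose pods are gone, matching
// the "RemoveStaleState: removing container" entries above.
func removeStaleState(state map[key]string, activePods map[string]bool) {
	for k := range state {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", k.podUID, k.container)
			delete(state, k)
		}
	}
}

func main() {
	state := map[key]string{
		{"db237d2f-d736-42bd-bd84-dc9d93909367", "keystone-db-sync"}: "cpuset: 0-3",
	}
	removeStaleState(state, map[string]bool{}) // the db-sync pod was deleted
}

These entries are logged at error level (E0930) even though they are routine cleanup, which is why completed one-shot jobs like the db-create pods each leave one behind.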
(UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-nb\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.378484 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-config\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.378553 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-sb\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.378588 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-svc\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.449696 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.454839 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.463738 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.465691 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.468024 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480395 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-combined-ca-bundle\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480466 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tlgb\" (UniqueName: \"kubernetes.io/projected/84e463e9-da58-476b-8c7d-5d7745c86118-kube-api-access-5tlgb\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480496 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-swift-storage-0\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480518 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-nb\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480536 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-config\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480583 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-fernet-keys\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480623 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-config-data\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480640 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-sb\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480656 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-credential-keys\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480674 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j659h\" (UniqueName: \"kubernetes.io/projected/a79babe8-df95-418a-a358-f161f40fc4ef-kube-api-access-j659h\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480702 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-svc\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.480717 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-scripts\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.481853 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-swift-storage-0\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.482385 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-nb\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.482893 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-config\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.483457 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-sb\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.483693 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-svc\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.501743 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a57f6d1-c09b-410a-af3c-8b3a010da11a" path="/var/lib/kubelet/pods/9a57f6d1-c09b-410a-af3c-8b3a010da11a/volumes" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.537623 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tlgb\" (UniqueName: \"kubernetes.io/projected/84e463e9-da58-476b-8c7d-5d7745c86118-kube-api-access-5tlgb\") pod \"dnsmasq-dns-7f4777664c-5lpdw\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.566823 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582223 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjqgm\" (UniqueName: \"kubernetes.io/projected/1baa8aa7-e856-4478-8662-26f094036b18-kube-api-access-tjqgm\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582281 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-log-httpd\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582317 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-run-httpd\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582341 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-fernet-keys\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582370 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-scripts\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582395 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-config-data\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582418 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-credential-keys\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582436 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-config-data\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582474 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j659h\" (UniqueName: \"kubernetes.io/projected/a79babe8-df95-418a-a358-f161f40fc4ef-kube-api-access-j659h\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582501 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582526 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582553 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-scripts\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.582592 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-combined-ca-bundle\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.588789 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-scripts\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.591989 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-credential-keys\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.593037 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-config-data\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.595929 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-combined-ca-bundle\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.632777 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j659h\" (UniqueName: \"kubernetes.io/projected/a79babe8-df95-418a-a358-f161f40fc4ef-kube-api-access-j659h\") pod \"keystone-bootstrap-6ncgn\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.641929 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-fernet-keys\") pod \"keystone-bootstrap-6ncgn\" (UID: 
\"a79babe8-df95-418a-a358-f161f40fc4ef\") " pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.642317 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6ncgn" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.676431 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-5lpdw"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.688043 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjqgm\" (UniqueName: \"kubernetes.io/projected/1baa8aa7-e856-4478-8662-26f094036b18-kube-api-access-tjqgm\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.688087 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-log-httpd\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.688115 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-run-httpd\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.688139 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-scripts\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.688159 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-config-data\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.688182 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.688202 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.689187 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-run-httpd\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.695892 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-log-httpd\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " 
pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.700151 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-scripts\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.708711 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.715066 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.717755 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-config-data\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.717840 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-h46w6"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.719926 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.724509 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-r6p2n"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.731545 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.742041 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.742217 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8m22d" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.742360 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.742458 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-h46w6"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.745992 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjqgm\" (UniqueName: \"kubernetes.io/projected/1baa8aa7-e856-4478-8662-26f094036b18-kube-api-access-tjqgm\") pod \"ceilometer-0\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.773643 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-r6p2n"] Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.791143 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.891737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-svc\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.892046 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-scripts\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.892076 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt5s7\" (UniqueName: \"kubernetes.io/projected/6386172b-d1f8-4dd4-897e-cd58a6acf678-kube-api-access-mt5s7\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.892134 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-combined-ca-bundle\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.892159 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56447315-e00d-4a65-9ee4-c58432d2ebca-logs\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.892188 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-config-data\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.892230 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-swift-storage-0\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.892273 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-sb\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.892293 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d9mk\" (UniqueName: \"kubernetes.io/projected/56447315-e00d-4a65-9ee4-c58432d2ebca-kube-api-access-5d9mk\") pod 
\"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.892318 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-config\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:58 crc kubenswrapper[4763]: I0930 13:53:58.892337 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-nb\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:58.998698 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-swift-storage-0\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:58.998778 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-sb\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:58.998804 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d9mk\" (UniqueName: \"kubernetes.io/projected/56447315-e00d-4a65-9ee4-c58432d2ebca-kube-api-access-5d9mk\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:58.998835 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-config\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:58.998860 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-nb\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:58.998903 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-svc\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:58.998965 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-scripts\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " 
pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:58.998991 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt5s7\" (UniqueName: \"kubernetes.io/projected/6386172b-d1f8-4dd4-897e-cd58a6acf678-kube-api-access-mt5s7\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:58.999043 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-combined-ca-bundle\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:58.999063 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56447315-e00d-4a65-9ee4-c58432d2ebca-logs\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:58.999103 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-config-data\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.000660 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-swift-storage-0\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.001384 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-nb\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.001423 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-sb\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.001959 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-svc\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.004988 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56447315-e00d-4a65-9ee4-c58432d2ebca-logs\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.005843 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-config\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.015007 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-scripts\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.016202 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-combined-ca-bundle\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.017513 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-config-data\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.028526 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt5s7\" (UniqueName: \"kubernetes.io/projected/6386172b-d1f8-4dd4-897e-cd58a6acf678-kube-api-access-mt5s7\") pod \"dnsmasq-dns-b4dc449d9-h46w6\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.034409 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d9mk\" (UniqueName: \"kubernetes.io/projected/56447315-e00d-4a65-9ee4-c58432d2ebca-kube-api-access-5d9mk\") pod \"placement-db-sync-r6p2n\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.117232 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.128703 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-r6p2n" Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.344698 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-5lpdw"] Sep 30 13:53:59 crc kubenswrapper[4763]: W0930 13:53:59.350851 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84e463e9_da58_476b_8c7d_5d7745c86118.slice/crio-61fc77fe2e228c79e74005db7aa747d7bcbdf4e6967a35e5d69591cae5bb29bd WatchSource:0}: Error finding container 61fc77fe2e228c79e74005db7aa747d7bcbdf4e6967a35e5d69591cae5bb29bd: Status 404 returned error can't find the container with id 61fc77fe2e228c79e74005db7aa747d7bcbdf4e6967a35e5d69591cae5bb29bd Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.526527 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6ncgn"] Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.610428 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:53:59 crc kubenswrapper[4763]: W0930 13:53:59.616476 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1baa8aa7_e856_4478_8662_26f094036b18.slice/crio-b2b7228adf60393419d81c71e98349fdeea84d300b80076976e5d9d722d2bf8f WatchSource:0}: Error finding container b2b7228adf60393419d81c71e98349fdeea84d300b80076976e5d9d722d2bf8f: Status 404 returned error can't find the container with id b2b7228adf60393419d81c71e98349fdeea84d300b80076976e5d9d722d2bf8f Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.722075 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-h46w6"] Sep 30 13:53:59 crc kubenswrapper[4763]: I0930 13:53:59.730474 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-r6p2n"] Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.116981 4763 generic.go:334] "Generic (PLEG): container finished" podID="84e463e9-da58-476b-8c7d-5d7745c86118" containerID="06f90a8223284603c87f0e7ebd7865ee596e76b986c9aeb20afa781534dbc891" exitCode=0 Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.117053 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" event={"ID":"84e463e9-da58-476b-8c7d-5d7745c86118","Type":"ContainerDied","Data":"06f90a8223284603c87f0e7ebd7865ee596e76b986c9aeb20afa781534dbc891"} Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.117081 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" event={"ID":"84e463e9-da58-476b-8c7d-5d7745c86118","Type":"ContainerStarted","Data":"61fc77fe2e228c79e74005db7aa747d7bcbdf4e6967a35e5d69591cae5bb29bd"} Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.121264 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1baa8aa7-e856-4478-8662-26f094036b18","Type":"ContainerStarted","Data":"b2b7228adf60393419d81c71e98349fdeea84d300b80076976e5d9d722d2bf8f"} Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.127334 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" event={"ID":"6386172b-d1f8-4dd4-897e-cd58a6acf678","Type":"ContainerStarted","Data":"e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815"} Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.127393 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" event={"ID":"6386172b-d1f8-4dd4-897e-cd58a6acf678","Type":"ContainerStarted","Data":"d7b88f1a9ebf6a7a57db309be0ed500a8edc9cfe637265e03d060ad66e389c0c"} Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.128720 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-r6p2n" event={"ID":"56447315-e00d-4a65-9ee4-c58432d2ebca","Type":"ContainerStarted","Data":"7e48614cdf4555f5669e703f01b52277f13bb979016a8d05aa9b3543a6e0800c"} Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.141774 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ncgn" event={"ID":"a79babe8-df95-418a-a358-f161f40fc4ef","Type":"ContainerStarted","Data":"fd79bb983a126745103e9f7ac0ec7a6f63a3419ae638e5a9063e7fdc0f374214"} Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.141817 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ncgn" event={"ID":"a79babe8-df95-418a-a358-f161f40fc4ef","Type":"ContainerStarted","Data":"0878e0c1265c8960739868d7ba1e40522b1857108a980764e841ec8cd1a4c25e"} Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.203069 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6ncgn" podStartSLOduration=2.203046215 podStartE2EDuration="2.203046215s" podCreationTimestamp="2025-09-30 13:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:54:00.186267934 +0000 UTC m=+1112.324828219" watchObservedRunningTime="2025-09-30 13:54:00.203046215 +0000 UTC m=+1112.341606500" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.477179 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.548323 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-nb\") pod \"84e463e9-da58-476b-8c7d-5d7745c86118\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.548759 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tlgb\" (UniqueName: \"kubernetes.io/projected/84e463e9-da58-476b-8c7d-5d7745c86118-kube-api-access-5tlgb\") pod \"84e463e9-da58-476b-8c7d-5d7745c86118\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.548785 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-svc\") pod \"84e463e9-da58-476b-8c7d-5d7745c86118\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.548816 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-swift-storage-0\") pod \"84e463e9-da58-476b-8c7d-5d7745c86118\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.548839 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-config\") pod \"84e463e9-da58-476b-8c7d-5d7745c86118\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.548946 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-sb\") pod \"84e463e9-da58-476b-8c7d-5d7745c86118\" (UID: \"84e463e9-da58-476b-8c7d-5d7745c86118\") " Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.559848 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e463e9-da58-476b-8c7d-5d7745c86118-kube-api-access-5tlgb" (OuterVolumeSpecName: "kube-api-access-5tlgb") pod "84e463e9-da58-476b-8c7d-5d7745c86118" (UID: "84e463e9-da58-476b-8c7d-5d7745c86118"). InnerVolumeSpecName "kube-api-access-5tlgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.576929 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.580245 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "84e463e9-da58-476b-8c7d-5d7745c86118" (UID: "84e463e9-da58-476b-8c7d-5d7745c86118"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.585201 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "84e463e9-da58-476b-8c7d-5d7745c86118" (UID: "84e463e9-da58-476b-8c7d-5d7745c86118"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.587006 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84e463e9-da58-476b-8c7d-5d7745c86118" (UID: "84e463e9-da58-476b-8c7d-5d7745c86118"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.597150 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "84e463e9-da58-476b-8c7d-5d7745c86118" (UID: "84e463e9-da58-476b-8c7d-5d7745c86118"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.598228 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-config" (OuterVolumeSpecName: "config") pod "84e463e9-da58-476b-8c7d-5d7745c86118" (UID: "84e463e9-da58-476b-8c7d-5d7745c86118"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.650589 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.650636 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.650649 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tlgb\" (UniqueName: \"kubernetes.io/projected/84e463e9-da58-476b-8c7d-5d7745c86118-kube-api-access-5tlgb\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.650665 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.650676 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:00 crc kubenswrapper[4763]: I0930 13:54:00.650686 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e463e9-da58-476b-8c7d-5d7745c86118-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:01 crc kubenswrapper[4763]: I0930 13:54:01.153138 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" event={"ID":"84e463e9-da58-476b-8c7d-5d7745c86118","Type":"ContainerDied","Data":"61fc77fe2e228c79e74005db7aa747d7bcbdf4e6967a35e5d69591cae5bb29bd"} Sep 30 13:54:01 crc kubenswrapper[4763]: I0930 13:54:01.153196 4763 scope.go:117] "RemoveContainer" containerID="06f90a8223284603c87f0e7ebd7865ee596e76b986c9aeb20afa781534dbc891" Sep 30 13:54:01 crc kubenswrapper[4763]: I0930 13:54:01.153324 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f4777664c-5lpdw" Sep 30 13:54:01 crc kubenswrapper[4763]: I0930 13:54:01.163896 4763 generic.go:334] "Generic (PLEG): container finished" podID="6386172b-d1f8-4dd4-897e-cd58a6acf678" containerID="e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815" exitCode=0 Sep 30 13:54:01 crc kubenswrapper[4763]: I0930 13:54:01.163985 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" event={"ID":"6386172b-d1f8-4dd4-897e-cd58a6acf678","Type":"ContainerDied","Data":"e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815"} Sep 30 13:54:01 crc kubenswrapper[4763]: I0930 13:54:01.164030 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" event={"ID":"6386172b-d1f8-4dd4-897e-cd58a6acf678","Type":"ContainerStarted","Data":"8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927"} Sep 30 13:54:01 crc kubenswrapper[4763]: I0930 13:54:01.164398 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:54:01 crc kubenswrapper[4763]: I0930 13:54:01.185981 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" podStartSLOduration=3.185963373 podStartE2EDuration="3.185963373s" podCreationTimestamp="2025-09-30 13:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:54:01.181761957 +0000 UTC m=+1113.320322262" watchObservedRunningTime="2025-09-30 13:54:01.185963373 +0000 UTC m=+1113.324523648" Sep 30 13:54:01 crc kubenswrapper[4763]: I0930 13:54:01.235203 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-5lpdw"] Sep 30 13:54:01 crc kubenswrapper[4763]: I0930 13:54:01.254110 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-5lpdw"] Sep 30 13:54:02 crc kubenswrapper[4763]: I0930 13:54:02.500973 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e463e9-da58-476b-8c7d-5d7745c86118" path="/var/lib/kubelet/pods/84e463e9-da58-476b-8c7d-5d7745c86118/volumes" Sep 30 13:54:04 crc kubenswrapper[4763]: I0930 13:54:04.233815 4763 generic.go:334] "Generic (PLEG): container finished" podID="a79babe8-df95-418a-a358-f161f40fc4ef" containerID="fd79bb983a126745103e9f7ac0ec7a6f63a3419ae638e5a9063e7fdc0f374214" exitCode=0 Sep 30 13:54:04 crc kubenswrapper[4763]: I0930 13:54:04.233913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ncgn" event={"ID":"a79babe8-df95-418a-a358-f161f40fc4ef","Type":"ContainerDied","Data":"fd79bb983a126745103e9f7ac0ec7a6f63a3419ae638e5a9063e7fdc0f374214"} Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.060045 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.060724 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.060777 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.062963 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"929286b0798b4123a28e4fd7afc0d057a5a3facafe7726db3c5285288ca63279"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.063030 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://929286b0798b4123a28e4fd7afc0d057a5a3facafe7726db3c5285288ca63279" gracePeriod=600 Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.765547 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-41ee-account-create-256xb"] Sep 30 13:54:06 crc kubenswrapper[4763]: E0930 13:54:06.765946 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e463e9-da58-476b-8c7d-5d7745c86118" containerName="init" Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.765966 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e463e9-da58-476b-8c7d-5d7745c86118" containerName="init" Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.766141 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e463e9-da58-476b-8c7d-5d7745c86118" containerName="init" Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.766781 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-41ee-account-create-256xb"
Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.769304 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.777113 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-41ee-account-create-256xb"]
Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.778946 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s694d\" (UniqueName: \"kubernetes.io/projected/1942c414-009b-4326-be52-5cf277802681-kube-api-access-s694d\") pod \"barbican-41ee-account-create-256xb\" (UID: \"1942c414-009b-4326-be52-5cf277802681\") " pod="openstack/barbican-41ee-account-create-256xb"
Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.882898 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s694d\" (UniqueName: \"kubernetes.io/projected/1942c414-009b-4326-be52-5cf277802681-kube-api-access-s694d\") pod \"barbican-41ee-account-create-256xb\" (UID: \"1942c414-009b-4326-be52-5cf277802681\") " pod="openstack/barbican-41ee-account-create-256xb"
Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.903268 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s694d\" (UniqueName: \"kubernetes.io/projected/1942c414-009b-4326-be52-5cf277802681-kube-api-access-s694d\") pod \"barbican-41ee-account-create-256xb\" (UID: \"1942c414-009b-4326-be52-5cf277802681\") " pod="openstack/barbican-41ee-account-create-256xb"
Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.971333 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-97e0-account-create-72gbj"]
Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.972854 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-97e0-account-create-72gbj"
Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.977787 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Sep 30 13:54:06 crc kubenswrapper[4763]: I0930 13:54:06.980919 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-97e0-account-create-72gbj"]
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.068193 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a40b-account-create-kpccm"]
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.069420 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a40b-account-create-kpccm"
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.072628 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.079958 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a40b-account-create-kpccm"]
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.092096 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-41ee-account-create-256xb"
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.095380 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs8hh\" (UniqueName: \"kubernetes.io/projected/7033b478-9e40-41ce-9c65-80a3c5c1273f-kube-api-access-xs8hh\") pod \"cinder-97e0-account-create-72gbj\" (UID: \"7033b478-9e40-41ce-9c65-80a3c5c1273f\") " pod="openstack/cinder-97e0-account-create-72gbj"
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.095449 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgdjk\" (UniqueName: \"kubernetes.io/projected/9c7985fd-8e51-4b64-8477-bd05c3577312-kube-api-access-pgdjk\") pod \"neutron-a40b-account-create-kpccm\" (UID: \"9c7985fd-8e51-4b64-8477-bd05c3577312\") " pod="openstack/neutron-a40b-account-create-kpccm"
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.196610 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs8hh\" (UniqueName: \"kubernetes.io/projected/7033b478-9e40-41ce-9c65-80a3c5c1273f-kube-api-access-xs8hh\") pod \"cinder-97e0-account-create-72gbj\" (UID: \"7033b478-9e40-41ce-9c65-80a3c5c1273f\") " pod="openstack/cinder-97e0-account-create-72gbj"
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.196699 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgdjk\" (UniqueName: \"kubernetes.io/projected/9c7985fd-8e51-4b64-8477-bd05c3577312-kube-api-access-pgdjk\") pod \"neutron-a40b-account-create-kpccm\" (UID: \"9c7985fd-8e51-4b64-8477-bd05c3577312\") " pod="openstack/neutron-a40b-account-create-kpccm"
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.231145 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgdjk\" (UniqueName: \"kubernetes.io/projected/9c7985fd-8e51-4b64-8477-bd05c3577312-kube-api-access-pgdjk\") pod \"neutron-a40b-account-create-kpccm\" (UID: \"9c7985fd-8e51-4b64-8477-bd05c3577312\") " pod="openstack/neutron-a40b-account-create-kpccm"
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.241183 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs8hh\" (UniqueName: \"kubernetes.io/projected/7033b478-9e40-41ce-9c65-80a3c5c1273f-kube-api-access-xs8hh\") pod \"cinder-97e0-account-create-72gbj\" (UID: \"7033b478-9e40-41ce-9c65-80a3c5c1273f\") " pod="openstack/cinder-97e0-account-create-72gbj"
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.296122 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-97e0-account-create-72gbj"
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.317344 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="929286b0798b4123a28e4fd7afc0d057a5a3facafe7726db3c5285288ca63279" exitCode=0
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.317385 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"929286b0798b4123a28e4fd7afc0d057a5a3facafe7726db3c5285288ca63279"}
Sep 30 13:54:07 crc kubenswrapper[4763]: I0930 13:54:07.401515 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a40b-account-create-kpccm"
Sep 30 13:54:09 crc kubenswrapper[4763]: I0930 13:54:09.119346 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6"
Sep 30 13:54:09 crc kubenswrapper[4763]: I0930 13:54:09.181759 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-jllqw"]
Sep 30 13:54:09 crc kubenswrapper[4763]: I0930 13:54:09.182010 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" podUID="fc62d0c3-527f-414f-ac05-3f788460ed17" containerName="dnsmasq-dns" containerID="cri-o://c03167ad2c60e33e2d2dca763f934331e66181e40cf32a7a135bab3e3ac6aa0d" gracePeriod=10
Sep 30 13:54:11 crc kubenswrapper[4763]: I0930 13:54:11.466316 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" podUID="fc62d0c3-527f-414f-ac05-3f788460ed17" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused"
Sep 30 13:54:13 crc kubenswrapper[4763]: E0930 13:54:13.549093 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070"
Sep 30 13:54:13 crc kubenswrapper[4763]: E0930 13:54:13.549564 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-57dpd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-q6kdz_openstack(7c3f0264-cce9-436f-923d-79f807488437): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 13:54:13 crc kubenswrapper[4763]: E0930 13:54:13.551018 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-q6kdz" podUID="7c3f0264-cce9-436f-923d-79f807488437"
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.611157 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6ncgn"
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.813141 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-scripts\") pod \"a79babe8-df95-418a-a358-f161f40fc4ef\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") "
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.813320 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-config-data\") pod \"a79babe8-df95-418a-a358-f161f40fc4ef\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") "
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.813348 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-combined-ca-bundle\") pod \"a79babe8-df95-418a-a358-f161f40fc4ef\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") "
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.813365 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-credential-keys\") pod \"a79babe8-df95-418a-a358-f161f40fc4ef\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") "
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.813393 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j659h\" (UniqueName: \"kubernetes.io/projected/a79babe8-df95-418a-a358-f161f40fc4ef-kube-api-access-j659h\") pod \"a79babe8-df95-418a-a358-f161f40fc4ef\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") "
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.813413 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-fernet-keys\") pod \"a79babe8-df95-418a-a358-f161f40fc4ef\" (UID: \"a79babe8-df95-418a-a358-f161f40fc4ef\") "
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.822081 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-scripts" (OuterVolumeSpecName: "scripts") pod "a79babe8-df95-418a-a358-f161f40fc4ef" (UID: "a79babe8-df95-418a-a358-f161f40fc4ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.822110 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a79babe8-df95-418a-a358-f161f40fc4ef" (UID: "a79babe8-df95-418a-a358-f161f40fc4ef"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.822138 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79babe8-df95-418a-a358-f161f40fc4ef-kube-api-access-j659h" (OuterVolumeSpecName: "kube-api-access-j659h") pod "a79babe8-df95-418a-a358-f161f40fc4ef" (UID: "a79babe8-df95-418a-a358-f161f40fc4ef"). InnerVolumeSpecName "kube-api-access-j659h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.825775 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a79babe8-df95-418a-a358-f161f40fc4ef" (UID: "a79babe8-df95-418a-a358-f161f40fc4ef"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.842703 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-config-data" (OuterVolumeSpecName: "config-data") pod "a79babe8-df95-418a-a358-f161f40fc4ef" (UID: "a79babe8-df95-418a-a358-f161f40fc4ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.851146 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a79babe8-df95-418a-a358-f161f40fc4ef" (UID: "a79babe8-df95-418a-a358-f161f40fc4ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.915214 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.915239 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.915250 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-credential-keys\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.915258 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j659h\" (UniqueName: \"kubernetes.io/projected/a79babe8-df95-418a-a358-f161f40fc4ef-kube-api-access-j659h\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.915267 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-fernet-keys\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:13 crc kubenswrapper[4763]: I0930 13:54:13.915275 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79babe8-df95-418a-a358-f161f40fc4ef-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.390670 4763 generic.go:334] "Generic (PLEG): container finished" podID="fc62d0c3-527f-414f-ac05-3f788460ed17" containerID="c03167ad2c60e33e2d2dca763f934331e66181e40cf32a7a135bab3e3ac6aa0d" exitCode=0
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.390728 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" event={"ID":"fc62d0c3-527f-414f-ac05-3f788460ed17","Type":"ContainerDied","Data":"c03167ad2c60e33e2d2dca763f934331e66181e40cf32a7a135bab3e3ac6aa0d"}
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.392573 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ncgn" event={"ID":"a79babe8-df95-418a-a358-f161f40fc4ef","Type":"ContainerDied","Data":"0878e0c1265c8960739868d7ba1e40522b1857108a980764e841ec8cd1a4c25e"}
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.392639 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0878e0c1265c8960739868d7ba1e40522b1857108a980764e841ec8cd1a4c25e"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.392582 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6ncgn"
Sep 30 13:54:14 crc kubenswrapper[4763]: E0930 13:54:14.394729 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070\\\"\"" pod="openstack/glance-db-sync-q6kdz" podUID="7c3f0264-cce9-436f-923d-79f807488437"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.680390 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6ncgn"]
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.686291 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6ncgn"]
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.794019 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jpdl6"]
Sep 30 13:54:14 crc kubenswrapper[4763]: E0930 13:54:14.794446 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79babe8-df95-418a-a358-f161f40fc4ef" containerName="keystone-bootstrap"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.794469 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79babe8-df95-418a-a358-f161f40fc4ef" containerName="keystone-bootstrap"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.794703 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79babe8-df95-418a-a358-f161f40fc4ef" containerName="keystone-bootstrap"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.798384 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.800490 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.800544 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pj6pt"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.800651 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.800847 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.804932 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jpdl6"]
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.832195 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-credential-keys\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.832274 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-scripts\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.832330 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-combined-ca-bundle\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.832353 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxtvv\" (UniqueName: \"kubernetes.io/projected/9d37da4a-5377-4d05-93c5-04f933f77894-kube-api-access-qxtvv\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.832373 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-fernet-keys\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.832455 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-config-data\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.933540 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-scripts\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.933629 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-combined-ca-bundle\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.933660 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxtvv\" (UniqueName: \"kubernetes.io/projected/9d37da4a-5377-4d05-93c5-04f933f77894-kube-api-access-qxtvv\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.933682 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-fernet-keys\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.933721 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-config-data\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.933747 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-credential-keys\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.937794 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-config-data\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.937413 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-scripts\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.938213 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-fernet-keys\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.942956 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-credential-keys\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.947467 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-combined-ca-bundle\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:14 crc kubenswrapper[4763]: I0930 13:54:14.952940 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxtvv\" (UniqueName: \"kubernetes.io/projected/9d37da4a-5377-4d05-93c5-04f933f77894-kube-api-access-qxtvv\") pod \"keystone-bootstrap-jpdl6\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:15 crc kubenswrapper[4763]: I0930 13:54:15.127368 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jpdl6"
Sep 30 13:54:15 crc kubenswrapper[4763]: I0930 13:54:15.321805 4763 scope.go:117] "RemoveContainer" containerID="2d22e10272d584e0d311db86eff7ac75db8f98341b6f7b1a40cf7584027c1ba8"
Sep 30 13:54:16 crc kubenswrapper[4763]: I0930 13:54:16.499013 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79babe8-df95-418a-a358-f161f40fc4ef" path="/var/lib/kubelet/pods/a79babe8-df95-418a-a358-f161f40fc4ef/volumes"
Sep 30 13:54:16 crc kubenswrapper[4763]: E0930 13:54:16.679924 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:9a3671dee1752ebe3639a0b16de95d29e779f1629d563e0585d65b9792542fc9"
Sep 30 13:54:16 crc kubenswrapper[4763]: E0930 13:54:16.680167 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:9a3671dee1752ebe3639a0b16de95d29e779f1629d563e0585d65b9792542fc9,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5d9mk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-r6p2n_openstack(56447315-e00d-4a65-9ee4-c58432d2ebca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 13:54:16 crc kubenswrapper[4763]: E0930 13:54:16.681408 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-r6p2n" podUID="56447315-e00d-4a65-9ee4-c58432d2ebca"
Sep 30 13:54:16 crc kubenswrapper[4763]: I0930 13:54:16.779881 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw"
Sep 30 13:54:16 crc kubenswrapper[4763]: I0930 13:54:16.963419 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-nb\") pod \"fc62d0c3-527f-414f-ac05-3f788460ed17\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") "
Sep 30 13:54:16 crc kubenswrapper[4763]: I0930 13:54:16.963544 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-config\") pod \"fc62d0c3-527f-414f-ac05-3f788460ed17\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") "
Sep 30 13:54:16 crc kubenswrapper[4763]: I0930 13:54:16.963621 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-swift-storage-0\") pod \"fc62d0c3-527f-414f-ac05-3f788460ed17\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") "
Sep 30 13:54:16 crc kubenswrapper[4763]: I0930 13:54:16.963675 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-svc\") pod \"fc62d0c3-527f-414f-ac05-3f788460ed17\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") "
Sep 30 13:54:16 crc kubenswrapper[4763]: I0930 13:54:16.963719 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw94r\" (UniqueName: \"kubernetes.io/projected/fc62d0c3-527f-414f-ac05-3f788460ed17-kube-api-access-lw94r\") pod \"fc62d0c3-527f-414f-ac05-3f788460ed17\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") "
Sep 30 13:54:16 crc kubenswrapper[4763]: I0930 13:54:16.963753 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-sb\") pod \"fc62d0c3-527f-414f-ac05-3f788460ed17\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") "
Sep 30 13:54:16 crc kubenswrapper[4763]: I0930 13:54:16.969861 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc62d0c3-527f-414f-ac05-3f788460ed17-kube-api-access-lw94r" (OuterVolumeSpecName: "kube-api-access-lw94r") pod "fc62d0c3-527f-414f-ac05-3f788460ed17" (UID: "fc62d0c3-527f-414f-ac05-3f788460ed17"). InnerVolumeSpecName "kube-api-access-lw94r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.004930 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc62d0c3-527f-414f-ac05-3f788460ed17" (UID: "fc62d0c3-527f-414f-ac05-3f788460ed17"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.005648 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc62d0c3-527f-414f-ac05-3f788460ed17" (UID: "fc62d0c3-527f-414f-ac05-3f788460ed17"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:54:17 crc kubenswrapper[4763]: E0930 13:54:17.018944 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-swift-storage-0 podName:fc62d0c3-527f-414f-ac05-3f788460ed17 nodeName:}" failed. No retries permitted until 2025-09-30 13:54:17.518918265 +0000 UTC m=+1129.657478550 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-swift-storage-0") pod "fc62d0c3-527f-414f-ac05-3f788460ed17" (UID: "fc62d0c3-527f-414f-ac05-3f788460ed17") : error deleting /var/lib/kubelet/pods/fc62d0c3-527f-414f-ac05-3f788460ed17/volume-subpaths: remove /var/lib/kubelet/pods/fc62d0c3-527f-414f-ac05-3f788460ed17/volume-subpaths: no such file or directory
Sep 30 13:54:17 crc kubenswrapper[4763]: E0930 13:54:17.018978 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-svc podName:fc62d0c3-527f-414f-ac05-3f788460ed17 nodeName:}" failed. No retries permitted until 2025-09-30 13:54:17.518970876 +0000 UTC m=+1129.657531161 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-svc") pod "fc62d0c3-527f-414f-ac05-3f788460ed17" (UID: "fc62d0c3-527f-414f-ac05-3f788460ed17") : error deleting /var/lib/kubelet/pods/fc62d0c3-527f-414f-ac05-3f788460ed17/volume-subpaths: remove /var/lib/kubelet/pods/fc62d0c3-527f-414f-ac05-3f788460ed17/volume-subpaths: no such file or directory
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.019247 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-config" (OuterVolumeSpecName: "config") pod "fc62d0c3-527f-414f-ac05-3f788460ed17" (UID: "fc62d0c3-527f-414f-ac05-3f788460ed17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.066743 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.066783 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-config\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.066795 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw94r\" (UniqueName: \"kubernetes.io/projected/fc62d0c3-527f-414f-ac05-3f788460ed17-kube-api-access-lw94r\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.066806 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:17 crc kubenswrapper[4763]: E0930 13:54:17.079703 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:32a25ac44706b73bff04a89514177b1efd675f0442b295e225f0020555ca6350"
Sep 30 13:54:17 crc kubenswrapper[4763]: E0930 13:54:17.079911 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:32a25ac44706b73bff04a89514177b1efd675f0442b295e225f0020555ca6350,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n687hcfh5d7h58fh678h6ch5cbh564h5bch5c5h675h696hffhddhd5h9bh644h5c6h74hfdh677h686h5bhcdh98h8bh79h5fdh669h99h646h5b8q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tjqgm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1baa8aa7-e856-4478-8662-26f094036b18): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.429885 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"93b97d46ec993310482c9f94e284fd8475a6addbce7a122971ed13904ff04071"}
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.432036 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" event={"ID":"fc62d0c3-527f-414f-ac05-3f788460ed17","Type":"ContainerDied","Data":"cbe05eec355533b9f4d42930465b94c9b197c3bf492c968b8c3831c3690062b4"}
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.432083 4763 scope.go:117] "RemoveContainer" containerID="c03167ad2c60e33e2d2dca763f934331e66181e40cf32a7a135bab3e3ac6aa0d"
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.432093 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw"
Sep 30 13:54:17 crc kubenswrapper[4763]: E0930 13:54:17.434855 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:9a3671dee1752ebe3639a0b16de95d29e779f1629d563e0585d65b9792542fc9\\\"\"" pod="openstack/placement-db-sync-r6p2n" podUID="56447315-e00d-4a65-9ee4-c58432d2ebca"
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.459703 4763 scope.go:117] "RemoveContainer" containerID="e6f8fa77a2b984e9027d916245f39ff10a0bf44210e52baa8a973f3c8f0e9867"
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.519826 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-97e0-account-create-72gbj"]
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.574048 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-swift-storage-0\") pod \"fc62d0c3-527f-414f-ac05-3f788460ed17\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") "
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.574369 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-svc\") pod \"fc62d0c3-527f-414f-ac05-3f788460ed17\" (UID: \"fc62d0c3-527f-414f-ac05-3f788460ed17\") "
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.574897 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc62d0c3-527f-414f-ac05-3f788460ed17" (UID: "fc62d0c3-527f-414f-ac05-3f788460ed17"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.575019 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc62d0c3-527f-414f-ac05-3f788460ed17" (UID: "fc62d0c3-527f-414f-ac05-3f788460ed17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.575379 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.575484 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc62d0c3-527f-414f-ac05-3f788460ed17-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.583667 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-41ee-account-create-256xb"]
Sep 30 13:54:17 crc kubenswrapper[4763]: W0930 13:54:17.589341 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1942c414_009b_4326_be52_5cf277802681.slice/crio-c2827ac1e7d7b863c75ea729c5164e8c58676e455938eefb473fe11a1c55ff5a WatchSource:0}: Error finding container c2827ac1e7d7b863c75ea729c5164e8c58676e455938eefb473fe11a1c55ff5a: Status 404 returned error can't find the container with id c2827ac1e7d7b863c75ea729c5164e8c58676e455938eefb473fe11a1c55ff5a
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.592808 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a40b-account-create-kpccm"]
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.708002 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jpdl6"]
Sep 30 13:54:17 crc kubenswrapper[4763]: W0930 13:54:17.714402 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d37da4a_5377_4d05_93c5_04f933f77894.slice/crio-79a14ef96003f2de28704241c5abfa1bf6396974e455f3f3656dcc85b38f3a86 WatchSource:0}: Error finding container 79a14ef96003f2de28704241c5abfa1bf6396974e455f3f3656dcc85b38f3a86: Status 404 returned error can't find the container with id 79a14ef96003f2de28704241c5abfa1bf6396974e455f3f3656dcc85b38f3a86
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.864487 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-jllqw"]
Sep 30 13:54:17 crc kubenswrapper[4763]: I0930 13:54:17.871855 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-jllqw"]
Sep 30 13:54:18 crc kubenswrapper[4763]: I0930 13:54:18.440221 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-97e0-account-create-72gbj" event={"ID":"7033b478-9e40-41ce-9c65-80a3c5c1273f","Type":"ContainerStarted","Data":"29a057f7e3e5df0f8f3a921b57ba3864cd15c6c5a6e6543c79fa63c986c34576"}
Sep 30 13:54:18 crc kubenswrapper[4763]: I0930 13:54:18.440471 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-97e0-account-create-72gbj" event={"ID":"7033b478-9e40-41ce-9c65-80a3c5c1273f","Type":"ContainerStarted","Data":"1ba413af5d1b316eda6eef32c48cb2a4727cce3a9495ae4b2d52feb025479a4b"}
Sep 30 13:54:18 crc kubenswrapper[4763]: I0930 13:54:18.441350 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpdl6" event={"ID":"9d37da4a-5377-4d05-93c5-04f933f77894","Type":"ContainerStarted","Data":"79a14ef96003f2de28704241c5abfa1bf6396974e455f3f3656dcc85b38f3a86"}
Sep 30 13:54:18 crc kubenswrapper[4763]: I0930 13:54:18.442403 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a40b-account-create-kpccm" event={"ID":"9c7985fd-8e51-4b64-8477-bd05c3577312","Type":"ContainerStarted","Data":"f7f2eb05351f5c23b04ef134d546441b3f94f453696ca9e76ea0625928a97114"}
Sep 30 13:54:18 crc kubenswrapper[4763]: I0930 13:54:18.444254 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-41ee-account-create-256xb" event={"ID":"1942c414-009b-4326-be52-5cf277802681","Type":"ContainerStarted","Data":"c2827ac1e7d7b863c75ea729c5164e8c58676e455938eefb473fe11a1c55ff5a"}
Sep 30 13:54:18 crc kubenswrapper[4763]: I0930 13:54:18.497894 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc62d0c3-527f-414f-ac05-3f788460ed17" path="/var/lib/kubelet/pods/fc62d0c3-527f-414f-ac05-3f788460ed17/volumes"
Sep 30 13:54:19 crc kubenswrapper[4763]: I0930 13:54:19.454523 4763 generic.go:334] "Generic (PLEG): container finished" podID="1942c414-009b-4326-be52-5cf277802681" containerID="f1e0864f632bb09307785cf5234b3bb987c83f38ba4012bfdeaef8121a56adfe" exitCode=0
Sep 30 13:54:19 crc kubenswrapper[4763]: I0930 13:54:19.454725 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-41ee-account-create-256xb" event={"ID":"1942c414-009b-4326-be52-5cf277802681","Type":"ContainerDied","Data":"f1e0864f632bb09307785cf5234b3bb987c83f38ba4012bfdeaef8121a56adfe"}
Sep 30 13:54:19 crc kubenswrapper[4763]: I0930 13:54:19.456349 4763 generic.go:334] "Generic (PLEG): container finished" podID="7033b478-9e40-41ce-9c65-80a3c5c1273f" containerID="29a057f7e3e5df0f8f3a921b57ba3864cd15c6c5a6e6543c79fa63c986c34576" exitCode=0
Sep 30 13:54:19 crc kubenswrapper[4763]: I0930 13:54:19.456430 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-97e0-account-create-72gbj" event={"ID":"7033b478-9e40-41ce-9c65-80a3c5c1273f","Type":"ContainerDied","Data":"29a057f7e3e5df0f8f3a921b57ba3864cd15c6c5a6e6543c79fa63c986c34576"}
Sep 30 13:54:19 crc kubenswrapper[4763]: I0930 13:54:19.458526 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpdl6" event={"ID":"9d37da4a-5377-4d05-93c5-04f933f77894","Type":"ContainerStarted","Data":"5e829dbda104a31e4cd527c5f5bbba0452beccd647814b138c2474881de0df51"}
Sep 30 13:54:19 crc kubenswrapper[4763]: I0930 13:54:19.460511 4763 generic.go:334] "Generic (PLEG): container finished" podID="9c7985fd-8e51-4b64-8477-bd05c3577312" containerID="6d69d05182fb0eb1fcd6f0067fd4d61b3fcc60f4c6a09fab437953f0bbf152dc" exitCode=0
Sep 30 13:54:19 crc kubenswrapper[4763]: I0930 13:54:19.460536 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a40b-account-create-kpccm" event={"ID":"9c7985fd-8e51-4b64-8477-bd05c3577312","Type":"ContainerDied","Data":"6d69d05182fb0eb1fcd6f0067fd4d61b3fcc60f4c6a09fab437953f0bbf152dc"}
Sep 30 13:54:19 crc kubenswrapper[4763]: I0930 13:54:19.510359 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jpdl6" podStartSLOduration=5.510341662 podStartE2EDuration="5.510341662s" podCreationTimestamp="2025-09-30 13:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:54:19.488046812 +0000 UTC m=+1131.626607097" watchObservedRunningTime="2025-09-30 13:54:19.510341662 +0000 UTC m=+1131.648901947"
Sep 30 13:54:20 crc kubenswrapper[4763]: I0930 13:54:20.470903 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1baa8aa7-e856-4478-8662-26f094036b18","Type":"ContainerStarted","Data":"6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1"}
Sep 30 13:54:20 crc kubenswrapper[4763]: I0930 13:54:20.881710 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-97e0-account-create-72gbj"
Sep 30 13:54:20 crc kubenswrapper[4763]: I0930 13:54:20.932995 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs8hh\" (UniqueName: \"kubernetes.io/projected/7033b478-9e40-41ce-9c65-80a3c5c1273f-kube-api-access-xs8hh\") pod \"7033b478-9e40-41ce-9c65-80a3c5c1273f\" (UID: \"7033b478-9e40-41ce-9c65-80a3c5c1273f\") "
Sep 30 13:54:20 crc kubenswrapper[4763]: I0930 13:54:20.938990 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7033b478-9e40-41ce-9c65-80a3c5c1273f-kube-api-access-xs8hh" (OuterVolumeSpecName: "kube-api-access-xs8hh") pod "7033b478-9e40-41ce-9c65-80a3c5c1273f" (UID: "7033b478-9e40-41ce-9c65-80a3c5c1273f"). InnerVolumeSpecName "kube-api-access-xs8hh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.013822 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a40b-account-create-kpccm"
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.020202 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-41ee-account-create-256xb"
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.035086 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s694d\" (UniqueName: \"kubernetes.io/projected/1942c414-009b-4326-be52-5cf277802681-kube-api-access-s694d\") pod \"1942c414-009b-4326-be52-5cf277802681\" (UID: \"1942c414-009b-4326-be52-5cf277802681\") "
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.035205 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgdjk\" (UniqueName: \"kubernetes.io/projected/9c7985fd-8e51-4b64-8477-bd05c3577312-kube-api-access-pgdjk\") pod \"9c7985fd-8e51-4b64-8477-bd05c3577312\" (UID: \"9c7985fd-8e51-4b64-8477-bd05c3577312\") "
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.035704 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs8hh\" (UniqueName: \"kubernetes.io/projected/7033b478-9e40-41ce-9c65-80a3c5c1273f-kube-api-access-xs8hh\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.039861 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1942c414-009b-4326-be52-5cf277802681-kube-api-access-s694d" (OuterVolumeSpecName: "kube-api-access-s694d") pod "1942c414-009b-4326-be52-5cf277802681" (UID: "1942c414-009b-4326-be52-5cf277802681"). InnerVolumeSpecName "kube-api-access-s694d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.043388 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c7985fd-8e51-4b64-8477-bd05c3577312-kube-api-access-pgdjk" (OuterVolumeSpecName: "kube-api-access-pgdjk") pod "9c7985fd-8e51-4b64-8477-bd05c3577312" (UID: "9c7985fd-8e51-4b64-8477-bd05c3577312"). InnerVolumeSpecName "kube-api-access-pgdjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.136662 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgdjk\" (UniqueName: \"kubernetes.io/projected/9c7985fd-8e51-4b64-8477-bd05c3577312-kube-api-access-pgdjk\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.136697 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s694d\" (UniqueName: \"kubernetes.io/projected/1942c414-009b-4326-be52-5cf277802681-kube-api-access-s694d\") on node \"crc\" DevicePath \"\""
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.466708 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cfbb96789-jllqw" podUID="fc62d0c3-527f-414f-ac05-3f788460ed17" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout"
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.481276 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-41ee-account-create-256xb" event={"ID":"1942c414-009b-4326-be52-5cf277802681","Type":"ContainerDied","Data":"c2827ac1e7d7b863c75ea729c5164e8c58676e455938eefb473fe11a1c55ff5a"}
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.481320 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2827ac1e7d7b863c75ea729c5164e8c58676e455938eefb473fe11a1c55ff5a"
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.481378 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-41ee-account-create-256xb"
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.486770 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-97e0-account-create-72gbj" event={"ID":"7033b478-9e40-41ce-9c65-80a3c5c1273f","Type":"ContainerDied","Data":"1ba413af5d1b316eda6eef32c48cb2a4727cce3a9495ae4b2d52feb025479a4b"}
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.486820 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ba413af5d1b316eda6eef32c48cb2a4727cce3a9495ae4b2d52feb025479a4b"
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.486876 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-97e0-account-create-72gbj"
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.496427 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a40b-account-create-kpccm" event={"ID":"9c7985fd-8e51-4b64-8477-bd05c3577312","Type":"ContainerDied","Data":"f7f2eb05351f5c23b04ef134d546441b3f94f453696ca9e76ea0625928a97114"}
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.496470 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7f2eb05351f5c23b04ef134d546441b3f94f453696ca9e76ea0625928a97114"
Sep 30 13:54:21 crc kubenswrapper[4763]: I0930 13:54:21.496486 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a40b-account-create-kpccm"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.206554 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-zjgmb"]
Sep 30 13:54:22 crc kubenswrapper[4763]: E0930 13:54:22.206928 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1942c414-009b-4326-be52-5cf277802681" containerName="mariadb-account-create"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.206945 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1942c414-009b-4326-be52-5cf277802681" containerName="mariadb-account-create"
Sep 30 13:54:22 crc kubenswrapper[4763]: E0930 13:54:22.206962 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc62d0c3-527f-414f-ac05-3f788460ed17" containerName="init"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.206969 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc62d0c3-527f-414f-ac05-3f788460ed17" containerName="init"
Sep 30 13:54:22 crc kubenswrapper[4763]: E0930 13:54:22.206979 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7033b478-9e40-41ce-9c65-80a3c5c1273f" containerName="mariadb-account-create"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.206985 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7033b478-9e40-41ce-9c65-80a3c5c1273f" containerName="mariadb-account-create"
Sep 30 13:54:22 crc kubenswrapper[4763]: E0930 13:54:22.206996 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7985fd-8e51-4b64-8477-bd05c3577312" containerName="mariadb-account-create"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.207002 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7985fd-8e51-4b64-8477-bd05c3577312" containerName="mariadb-account-create"
Sep 30 13:54:22 crc kubenswrapper[4763]: E0930 13:54:22.207022 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc62d0c3-527f-414f-ac05-3f788460ed17" containerName="dnsmasq-dns"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.207028 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc62d0c3-527f-414f-ac05-3f788460ed17" containerName="dnsmasq-dns"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.207177 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7033b478-9e40-41ce-9c65-80a3c5c1273f" containerName="mariadb-account-create"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.207190 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc62d0c3-527f-414f-ac05-3f788460ed17" containerName="dnsmasq-dns"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.207198 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1942c414-009b-4326-be52-5cf277802681" containerName="mariadb-account-create"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.207218 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c7985fd-8e51-4b64-8477-bd05c3577312" containerName="mariadb-account-create"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.207714 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.210202 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nwkzr"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.213016 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.213155 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.217397 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zjgmb"]
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.257065 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e000c274-a7a0-493f-a0ea-537e5c474cb0-etc-machine-id\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.257140 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-combined-ca-bundle\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.257180 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv2r6\" (UniqueName: \"kubernetes.io/projected/e000c274-a7a0-493f-a0ea-537e5c474cb0-kube-api-access-hv2r6\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.257216 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-config-data\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.257313 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-scripts\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.257391 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-db-sync-config-data\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.358957 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-db-sync-config-data\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.359050 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e000c274-a7a0-493f-a0ea-537e5c474cb0-etc-machine-id\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.359086 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-combined-ca-bundle\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.359114 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2r6\" (UniqueName: \"kubernetes.io/projected/e000c274-a7a0-493f-a0ea-537e5c474cb0-kube-api-access-hv2r6\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.359141 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-config-data\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.359185 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-scripts\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.360417 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e000c274-a7a0-493f-a0ea-537e5c474cb0-etc-machine-id\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.366641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-combined-ca-bundle\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.367020 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-scripts\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.368181 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-db-sync-config-data\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.387712 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-config-data\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.389688 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2r6\" (UniqueName: \"kubernetes.io/projected/e000c274-a7a0-493f-a0ea-537e5c474cb0-kube-api-access-hv2r6\") pod \"cinder-db-sync-zjgmb\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.448858 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-xmvws"]
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.450047 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xmvws"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.452845 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.453140 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.462119 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bf5w6"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.466970 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-xmvws"]
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.533004 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zjgmb"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.565105 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgmxt\" (UniqueName: \"kubernetes.io/projected/96adbfe1-e6f8-4460-b999-a213cb396c4b-kube-api-access-jgmxt\") pod \"neutron-db-sync-xmvws\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " pod="openstack/neutron-db-sync-xmvws"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.565421 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-config\") pod \"neutron-db-sync-xmvws\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " pod="openstack/neutron-db-sync-xmvws"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.565476 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-combined-ca-bundle\") pod \"neutron-db-sync-xmvws\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " pod="openstack/neutron-db-sync-xmvws"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.666632 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgmxt\" (UniqueName: \"kubernetes.io/projected/96adbfe1-e6f8-4460-b999-a213cb396c4b-kube-api-access-jgmxt\") pod \"neutron-db-sync-xmvws\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " pod="openstack/neutron-db-sync-xmvws"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.666751 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-config\") pod \"neutron-db-sync-xmvws\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " pod="openstack/neutron-db-sync-xmvws"
Sep 30 13:54:22 crc kubenswrapper[4763]: I0930
13:54:22.666833 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-combined-ca-bundle\") pod \"neutron-db-sync-xmvws\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " pod="openstack/neutron-db-sync-xmvws" Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.671483 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-combined-ca-bundle\") pod \"neutron-db-sync-xmvws\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " pod="openstack/neutron-db-sync-xmvws" Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.672132 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-config\") pod \"neutron-db-sync-xmvws\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " pod="openstack/neutron-db-sync-xmvws" Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.686344 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgmxt\" (UniqueName: \"kubernetes.io/projected/96adbfe1-e6f8-4460-b999-a213cb396c4b-kube-api-access-jgmxt\") pod \"neutron-db-sync-xmvws\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " pod="openstack/neutron-db-sync-xmvws" Sep 30 13:54:22 crc kubenswrapper[4763]: I0930 13:54:22.778302 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xmvws" Sep 30 13:54:24 crc kubenswrapper[4763]: I0930 13:54:24.809051 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zjgmb"] Sep 30 13:54:25 crc kubenswrapper[4763]: I0930 13:54:25.067859 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-xmvws"] Sep 30 13:54:25 crc kubenswrapper[4763]: W0930 13:54:25.073228 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96adbfe1_e6f8_4460_b999_a213cb396c4b.slice/crio-78b6941f5e8bb3b06e8fbfcbabf262a7ed631124053f1081a46c4b56fe51e8a4 WatchSource:0}: Error finding container 78b6941f5e8bb3b06e8fbfcbabf262a7ed631124053f1081a46c4b56fe51e8a4: Status 404 returned error can't find the container with id 78b6941f5e8bb3b06e8fbfcbabf262a7ed631124053f1081a46c4b56fe51e8a4 Sep 30 13:54:25 crc kubenswrapper[4763]: I0930 13:54:25.546091 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjgmb" event={"ID":"e000c274-a7a0-493f-a0ea-537e5c474cb0","Type":"ContainerStarted","Data":"f2e3d805f91ff3eb424ecae60705132d2fb21a2b0a675c59cf1cb2ef1316f545"} Sep 30 13:54:25 crc kubenswrapper[4763]: I0930 13:54:25.547194 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xmvws" event={"ID":"96adbfe1-e6f8-4460-b999-a213cb396c4b","Type":"ContainerStarted","Data":"78b6941f5e8bb3b06e8fbfcbabf262a7ed631124053f1081a46c4b56fe51e8a4"} Sep 30 13:54:26 crc kubenswrapper[4763]: I0930 13:54:26.558445 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xmvws" event={"ID":"96adbfe1-e6f8-4460-b999-a213cb396c4b","Type":"ContainerStarted","Data":"0bd45db8372dd13c7f8b6af5a8f009d51c1365ea9b4c411b536ad6bd5ed5a9de"} Sep 30 13:54:26 crc kubenswrapper[4763]: I0930 13:54:26.560771 4763 generic.go:334] "Generic (PLEG): container finished" podID="9d37da4a-5377-4d05-93c5-04f933f77894" 
containerID="5e829dbda104a31e4cd527c5f5bbba0452beccd647814b138c2474881de0df51" exitCode=0 Sep 30 13:54:26 crc kubenswrapper[4763]: I0930 13:54:26.560823 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpdl6" event={"ID":"9d37da4a-5377-4d05-93c5-04f933f77894","Type":"ContainerDied","Data":"5e829dbda104a31e4cd527c5f5bbba0452beccd647814b138c2474881de0df51"} Sep 30 13:54:26 crc kubenswrapper[4763]: I0930 13:54:26.608007 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-xmvws" podStartSLOduration=4.607987238 podStartE2EDuration="4.607987238s" podCreationTimestamp="2025-09-30 13:54:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:54:26.584839947 +0000 UTC m=+1138.723400232" watchObservedRunningTime="2025-09-30 13:54:26.607987238 +0000 UTC m=+1138.746547523" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.071375 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-nps2w"] Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.072370 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nps2w" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.074974 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.075987 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w4hgl" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.086982 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nps2w"] Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.142141 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-db-sync-config-data\") pod \"barbican-db-sync-nps2w\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " pod="openstack/barbican-db-sync-nps2w" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.142214 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-combined-ca-bundle\") pod \"barbican-db-sync-nps2w\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " pod="openstack/barbican-db-sync-nps2w" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.142286 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk86j\" (UniqueName: \"kubernetes.io/projected/cbda58bd-f991-4d2d-ba12-f3945505afa6-kube-api-access-dk86j\") pod \"barbican-db-sync-nps2w\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " pod="openstack/barbican-db-sync-nps2w" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.243830 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-db-sync-config-data\") pod \"barbican-db-sync-nps2w\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " pod="openstack/barbican-db-sync-nps2w" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.243917 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-combined-ca-bundle\") pod \"barbican-db-sync-nps2w\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " pod="openstack/barbican-db-sync-nps2w" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.244004 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk86j\" (UniqueName: \"kubernetes.io/projected/cbda58bd-f991-4d2d-ba12-f3945505afa6-kube-api-access-dk86j\") pod \"barbican-db-sync-nps2w\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " pod="openstack/barbican-db-sync-nps2w" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.253249 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-combined-ca-bundle\") pod \"barbican-db-sync-nps2w\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " pod="openstack/barbican-db-sync-nps2w" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.257079 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-db-sync-config-data\") pod \"barbican-db-sync-nps2w\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " pod="openstack/barbican-db-sync-nps2w" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.265293 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk86j\" (UniqueName: \"kubernetes.io/projected/cbda58bd-f991-4d2d-ba12-f3945505afa6-kube-api-access-dk86j\") pod \"barbican-db-sync-nps2w\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " pod="openstack/barbican-db-sync-nps2w" Sep 30 13:54:27 crc kubenswrapper[4763]: I0930 13:54:27.392033 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nps2w" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.575326 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jpdl6" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.603042 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpdl6" event={"ID":"9d37da4a-5377-4d05-93c5-04f933f77894","Type":"ContainerDied","Data":"79a14ef96003f2de28704241c5abfa1bf6396974e455f3f3656dcc85b38f3a86"} Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.603099 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79a14ef96003f2de28704241c5abfa1bf6396974e455f3f3656dcc85b38f3a86" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.603144 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jpdl6" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.694916 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-config-data\") pod \"9d37da4a-5377-4d05-93c5-04f933f77894\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.694969 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-credential-keys\") pod \"9d37da4a-5377-4d05-93c5-04f933f77894\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.695028 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-combined-ca-bundle\") pod \"9d37da4a-5377-4d05-93c5-04f933f77894\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.695171 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxtvv\" (UniqueName: \"kubernetes.io/projected/9d37da4a-5377-4d05-93c5-04f933f77894-kube-api-access-qxtvv\") pod \"9d37da4a-5377-4d05-93c5-04f933f77894\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.695204 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-fernet-keys\") pod \"9d37da4a-5377-4d05-93c5-04f933f77894\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.695234 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-scripts\") pod \"9d37da4a-5377-4d05-93c5-04f933f77894\" (UID: \"9d37da4a-5377-4d05-93c5-04f933f77894\") " Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.714557 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d37da4a-5377-4d05-93c5-04f933f77894-kube-api-access-qxtvv" (OuterVolumeSpecName: "kube-api-access-qxtvv") pod "9d37da4a-5377-4d05-93c5-04f933f77894" (UID: "9d37da4a-5377-4d05-93c5-04f933f77894"). InnerVolumeSpecName "kube-api-access-qxtvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.737893 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-scripts" (OuterVolumeSpecName: "scripts") pod "9d37da4a-5377-4d05-93c5-04f933f77894" (UID: "9d37da4a-5377-4d05-93c5-04f933f77894"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.741870 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9d37da4a-5377-4d05-93c5-04f933f77894" (UID: "9d37da4a-5377-4d05-93c5-04f933f77894"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.744740 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9d37da4a-5377-4d05-93c5-04f933f77894" (UID: "9d37da4a-5377-4d05-93c5-04f933f77894"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.801383 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxtvv\" (UniqueName: \"kubernetes.io/projected/9d37da4a-5377-4d05-93c5-04f933f77894-kube-api-access-qxtvv\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.801414 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.801425 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.801434 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.866120 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5775d899cd-b25ch"] Sep 30 13:54:28 crc kubenswrapper[4763]: E0930 13:54:28.866520 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d37da4a-5377-4d05-93c5-04f933f77894" containerName="keystone-bootstrap" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.866536 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d37da4a-5377-4d05-93c5-04f933f77894" containerName="keystone-bootstrap" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.866775 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d37da4a-5377-4d05-93c5-04f933f77894" containerName="keystone-bootstrap" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.867282 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.872274 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.873892 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.883902 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d37da4a-5377-4d05-93c5-04f933f77894" (UID: "9d37da4a-5377-4d05-93c5-04f933f77894"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.886800 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-config-data" (OuterVolumeSpecName: "config-data") pod "9d37da4a-5377-4d05-93c5-04f933f77894" (UID: "9d37da4a-5377-4d05-93c5-04f933f77894"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.907691 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-credential-keys\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.907767 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-scripts\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.907813 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-combined-ca-bundle\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.907857 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-public-tls-certs\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.907882 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-fernet-keys\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.907900 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-config-data\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.907922 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-internal-tls-certs\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.907985 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9jk9\" (UniqueName: 
\"kubernetes.io/projected/0cf247fc-bc61-4305-b8a5-19ac60eba62a-kube-api-access-j9jk9\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.908041 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.908053 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d37da4a-5377-4d05-93c5-04f933f77894-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:28 crc kubenswrapper[4763]: I0930 13:54:28.912668 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5775d899cd-b25ch"] Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.019821 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-combined-ca-bundle\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.020147 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-public-tls-certs\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.020172 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-fernet-keys\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.020190 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-config-data\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.020208 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-internal-tls-certs\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.020245 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9jk9\" (UniqueName: \"kubernetes.io/projected/0cf247fc-bc61-4305-b8a5-19ac60eba62a-kube-api-access-j9jk9\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.020292 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-credential-keys\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " 
pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.020325 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-scripts\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.024272 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-scripts\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.028060 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-fernet-keys\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.031284 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-internal-tls-certs\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.033375 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-credential-keys\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.033425 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-config-data\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.033735 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-public-tls-certs\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.033921 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-combined-ca-bundle\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.041866 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9jk9\" (UniqueName: \"kubernetes.io/projected/0cf247fc-bc61-4305-b8a5-19ac60eba62a-kube-api-access-j9jk9\") pod \"keystone-5775d899cd-b25ch\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") " pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.130336 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nps2w"] 
Sep 30 13:54:29 crc kubenswrapper[4763]: W0930 13:54:29.155118 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbda58bd_f991_4d2d_ba12_f3945505afa6.slice/crio-92df2ac8d5d4483dceb93330190281bb0e834e8d495f3527f99880e8ac2b6118 WatchSource:0}: Error finding container 92df2ac8d5d4483dceb93330190281bb0e834e8d495f3527f99880e8ac2b6118: Status 404 returned error can't find the container with id 92df2ac8d5d4483dceb93330190281bb0e834e8d495f3527f99880e8ac2b6118 Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.184400 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.612843 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nps2w" event={"ID":"cbda58bd-f991-4d2d-ba12-f3945505afa6","Type":"ContainerStarted","Data":"92df2ac8d5d4483dceb93330190281bb0e834e8d495f3527f99880e8ac2b6118"} Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.615578 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1baa8aa7-e856-4478-8662-26f094036b18","Type":"ContainerStarted","Data":"49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995"} Sep 30 13:54:29 crc kubenswrapper[4763]: I0930 13:54:29.635319 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5775d899cd-b25ch"] Sep 30 13:54:29 crc kubenswrapper[4763]: W0930 13:54:29.645127 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cf247fc_bc61_4305_b8a5_19ac60eba62a.slice/crio-ad395800e989605b6f3a8e35f1d2619b2d7eddf6583445bd471a7c51aab1d6ae WatchSource:0}: Error finding container ad395800e989605b6f3a8e35f1d2619b2d7eddf6583445bd471a7c51aab1d6ae: Status 404 returned error can't find the container with id ad395800e989605b6f3a8e35f1d2619b2d7eddf6583445bd471a7c51aab1d6ae Sep 30 13:54:30 crc kubenswrapper[4763]: I0930 13:54:30.628381 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5775d899cd-b25ch" event={"ID":"0cf247fc-bc61-4305-b8a5-19ac60eba62a","Type":"ContainerStarted","Data":"ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4"} Sep 30 13:54:30 crc kubenswrapper[4763]: I0930 13:54:30.629340 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5775d899cd-b25ch" event={"ID":"0cf247fc-bc61-4305-b8a5-19ac60eba62a","Type":"ContainerStarted","Data":"ad395800e989605b6f3a8e35f1d2619b2d7eddf6583445bd471a7c51aab1d6ae"} Sep 30 13:54:30 crc kubenswrapper[4763]: I0930 13:54:30.629395 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:54:30 crc kubenswrapper[4763]: I0930 13:54:30.633033 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-r6p2n" event={"ID":"56447315-e00d-4a65-9ee4-c58432d2ebca","Type":"ContainerStarted","Data":"5529dafb50ba26bbc36ba32edb787859ab716d780082a8b3e8be2e416c1a2e80"} Sep 30 13:54:30 crc kubenswrapper[4763]: I0930 13:54:30.634978 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-q6kdz" event={"ID":"7c3f0264-cce9-436f-923d-79f807488437","Type":"ContainerStarted","Data":"d5c0dbee3becae192bb8e52217ee73cb5863f82428aefff98191c654a4fd0735"} Sep 30 13:54:30 crc kubenswrapper[4763]: I0930 13:54:30.657998 4763 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/keystone-5775d899cd-b25ch" podStartSLOduration=2.657981419 podStartE2EDuration="2.657981419s" podCreationTimestamp="2025-09-30 13:54:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:54:30.650962173 +0000 UTC m=+1142.789522478" watchObservedRunningTime="2025-09-30 13:54:30.657981419 +0000 UTC m=+1142.796541704" Sep 30 13:54:30 crc kubenswrapper[4763]: I0930 13:54:30.671502 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-q6kdz" podStartSLOduration=5.195507813 podStartE2EDuration="40.671483617s" podCreationTimestamp="2025-09-30 13:53:50 +0000 UTC" firstStartedPulling="2025-09-30 13:53:53.173812286 +0000 UTC m=+1105.312372571" lastFinishedPulling="2025-09-30 13:54:28.64978809 +0000 UTC m=+1140.788348375" observedRunningTime="2025-09-30 13:54:30.665279222 +0000 UTC m=+1142.803839497" watchObservedRunningTime="2025-09-30 13:54:30.671483617 +0000 UTC m=+1142.810043902" Sep 30 13:54:31 crc kubenswrapper[4763]: I0930 13:54:31.665027 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-r6p2n" podStartSLOduration=3.137429166 podStartE2EDuration="33.665002832s" podCreationTimestamp="2025-09-30 13:53:58 +0000 UTC" firstStartedPulling="2025-09-30 13:53:59.73143982 +0000 UTC m=+1111.870000105" lastFinishedPulling="2025-09-30 13:54:30.259013486 +0000 UTC m=+1142.397573771" observedRunningTime="2025-09-30 13:54:31.661583785 +0000 UTC m=+1143.800144080" watchObservedRunningTime="2025-09-30 13:54:31.665002832 +0000 UTC m=+1143.803563117" Sep 30 13:54:35 crc kubenswrapper[4763]: I0930 13:54:35.688318 4763 generic.go:334] "Generic (PLEG): container finished" podID="56447315-e00d-4a65-9ee4-c58432d2ebca" containerID="5529dafb50ba26bbc36ba32edb787859ab716d780082a8b3e8be2e416c1a2e80" exitCode=0 Sep 30 13:54:35 crc kubenswrapper[4763]: I0930 13:54:35.688455 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-r6p2n" event={"ID":"56447315-e00d-4a65-9ee4-c58432d2ebca","Type":"ContainerDied","Data":"5529dafb50ba26bbc36ba32edb787859ab716d780082a8b3e8be2e416c1a2e80"} Sep 30 13:54:39 crc kubenswrapper[4763]: I0930 13:54:39.949267 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-r6p2n" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.049765 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d9mk\" (UniqueName: \"kubernetes.io/projected/56447315-e00d-4a65-9ee4-c58432d2ebca-kube-api-access-5d9mk\") pod \"56447315-e00d-4a65-9ee4-c58432d2ebca\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.049870 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56447315-e00d-4a65-9ee4-c58432d2ebca-logs\") pod \"56447315-e00d-4a65-9ee4-c58432d2ebca\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.050010 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-scripts\") pod \"56447315-e00d-4a65-9ee4-c58432d2ebca\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.050108 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-config-data\") pod \"56447315-e00d-4a65-9ee4-c58432d2ebca\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.050149 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-combined-ca-bundle\") pod \"56447315-e00d-4a65-9ee4-c58432d2ebca\" (UID: \"56447315-e00d-4a65-9ee4-c58432d2ebca\") " Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.051405 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56447315-e00d-4a65-9ee4-c58432d2ebca-logs" (OuterVolumeSpecName: "logs") pod "56447315-e00d-4a65-9ee4-c58432d2ebca" (UID: "56447315-e00d-4a65-9ee4-c58432d2ebca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.057665 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-scripts" (OuterVolumeSpecName: "scripts") pod "56447315-e00d-4a65-9ee4-c58432d2ebca" (UID: "56447315-e00d-4a65-9ee4-c58432d2ebca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.062939 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56447315-e00d-4a65-9ee4-c58432d2ebca-kube-api-access-5d9mk" (OuterVolumeSpecName: "kube-api-access-5d9mk") pod "56447315-e00d-4a65-9ee4-c58432d2ebca" (UID: "56447315-e00d-4a65-9ee4-c58432d2ebca"). InnerVolumeSpecName "kube-api-access-5d9mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.075018 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-config-data" (OuterVolumeSpecName: "config-data") pod "56447315-e00d-4a65-9ee4-c58432d2ebca" (UID: "56447315-e00d-4a65-9ee4-c58432d2ebca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.079070 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56447315-e00d-4a65-9ee4-c58432d2ebca" (UID: "56447315-e00d-4a65-9ee4-c58432d2ebca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.151928 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.151958 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.151970 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56447315-e00d-4a65-9ee4-c58432d2ebca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.151983 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d9mk\" (UniqueName: \"kubernetes.io/projected/56447315-e00d-4a65-9ee4-c58432d2ebca-kube-api-access-5d9mk\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.151995 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56447315-e00d-4a65-9ee4-c58432d2ebca-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.740391 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-r6p2n" event={"ID":"56447315-e00d-4a65-9ee4-c58432d2ebca","Type":"ContainerDied","Data":"7e48614cdf4555f5669e703f01b52277f13bb979016a8d05aa9b3543a6e0800c"} Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.740767 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e48614cdf4555f5669e703f01b52277f13bb979016a8d05aa9b3543a6e0800c" Sep 30 13:54:40 crc kubenswrapper[4763]: I0930 13:54:40.740525 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-r6p2n" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.048690 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5b87bfdd4b-tbjxc"] Sep 30 13:54:41 crc kubenswrapper[4763]: E0930 13:54:41.049119 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56447315-e00d-4a65-9ee4-c58432d2ebca" containerName="placement-db-sync" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.049133 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="56447315-e00d-4a65-9ee4-c58432d2ebca" containerName="placement-db-sync" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.049352 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="56447315-e00d-4a65-9ee4-c58432d2ebca" containerName="placement-db-sync" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.050442 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.053630 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8m22d" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.054057 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.054240 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.054399 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.054553 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.065930 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b87bfdd4b-tbjxc"] Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.168365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4crl\" (UniqueName: \"kubernetes.io/projected/15fbd312-35ac-4e62-ad60-ffccf94eab4a-kube-api-access-g4crl\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.168473 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-scripts\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.168651 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-combined-ca-bundle\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.168708 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-internal-tls-certs\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.168769 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-config-data\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.168795 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fbd312-35ac-4e62-ad60-ffccf94eab4a-logs\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.169019 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-public-tls-certs\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.271150 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-config-data\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.271204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fbd312-35ac-4e62-ad60-ffccf94eab4a-logs\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.271306 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-public-tls-certs\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.271335 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4crl\" (UniqueName: \"kubernetes.io/projected/15fbd312-35ac-4e62-ad60-ffccf94eab4a-kube-api-access-g4crl\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.271388 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-scripts\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.271424 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-combined-ca-bundle\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.271450 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-internal-tls-certs\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.271912 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fbd312-35ac-4e62-ad60-ffccf94eab4a-logs\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.275883 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-internal-tls-certs\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.277488 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-config-data\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.277754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-scripts\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.278033 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-public-tls-certs\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.279126 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-combined-ca-bundle\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.293471 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4crl\" (UniqueName: \"kubernetes.io/projected/15fbd312-35ac-4e62-ad60-ffccf94eab4a-kube-api-access-g4crl\") pod \"placement-5b87bfdd4b-tbjxc\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:41 crc kubenswrapper[4763]: I0930 13:54:41.385350 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:56 crc kubenswrapper[4763]: E0930 13:54:56.183511 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695" Sep 30 13:54:56 crc kubenswrapper[4763]: E0930 13:54:56.184260 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hv2r6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-zjgmb_openstack(e000c274-a7a0-493f-a0ea-537e5c474cb0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:54:56 crc kubenswrapper[4763]: E0930 13:54:56.185768 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-zjgmb" podUID="e000c274-a7a0-493f-a0ea-537e5c474cb0" Sep 30 13:54:56 crc kubenswrapper[4763]: E0930 
13:54:56.846151 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48" Sep 30 13:54:56 crc kubenswrapper[4763]: E0930 13:54:56.846987 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tjqgm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1baa8aa7-e856-4478-8662-26f094036b18): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:54:56 crc kubenswrapper[4763]: E0930 13:54:56.848264 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for 
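Both "PullImage from image service failed" errors above carry rpc code = Canceled with "context canceled", and the huge "Unhandled Error" container dumps that follow are the kubelet reporting the same aborted pull, not a registry rejection: the pull's context was cancelled mid-copy (here, while the ceilometer-0 pod was being torn down). A dependency-free Go sketch of that failure mode, illustrative only and not the kubelet's actual code path:

```go
package main

import (
	"context"
	"fmt"
	"io"
	"time"
)

// slowReader stands in for a registry blob download; it checks the context
// between chunks, the way an image pull is aborted mid-copy.
type slowReader struct{ ctx context.Context }

func (r *slowReader) Read(p []byte) (int, error) {
	select {
	case <-r.ctx.Done():
		// Surfaces as "copying config: context canceled" once wrapped.
		return 0, fmt.Errorf("copying config: %w", r.ctx.Err())
	case <-time.After(50 * time.Millisecond):
		return len(p), nil // pretend we fetched a chunk
	}
}

func main() {
	ctx, cancel := context.WithCancel(context.Background())
	go func() { time.Sleep(120 * time.Millisecond); cancel() }() // pull aborted
	n, err := io.Copy(io.Discard, &slowReader{ctx: ctx})
	fmt.Printf("copied %d bytes, err: %v\n", n, err)
}
```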
\"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="1baa8aa7-e856-4478-8662-26f094036b18" Sep 30 13:54:56 crc kubenswrapper[4763]: I0930 13:54:56.886152 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1baa8aa7-e856-4478-8662-26f094036b18" containerName="ceilometer-notification-agent" containerID="cri-o://6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1" gracePeriod=30 Sep 30 13:54:56 crc kubenswrapper[4763]: I0930 13:54:56.886311 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1baa8aa7-e856-4478-8662-26f094036b18" containerName="sg-core" containerID="cri-o://49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995" gracePeriod=30 Sep 30 13:54:56 crc kubenswrapper[4763]: E0930 13:54:56.889891 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695\\\"\"" pod="openstack/cinder-db-sync-zjgmb" podUID="e000c274-a7a0-493f-a0ea-537e5c474cb0" Sep 30 13:54:57 crc kubenswrapper[4763]: I0930 13:54:57.292726 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b87bfdd4b-tbjxc"] Sep 30 13:54:57 crc kubenswrapper[4763]: I0930 13:54:57.895050 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b87bfdd4b-tbjxc" event={"ID":"15fbd312-35ac-4e62-ad60-ffccf94eab4a","Type":"ContainerStarted","Data":"0c4968490ce8a08e1aec5c2072537900212ab6566f70cf01898816dd71f1b15c"} Sep 30 13:54:57 crc kubenswrapper[4763]: I0930 13:54:57.895101 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b87bfdd4b-tbjxc" event={"ID":"15fbd312-35ac-4e62-ad60-ffccf94eab4a","Type":"ContainerStarted","Data":"406b783e23b8c4c640b95b39ff2f63415a30004ec013d7d1ccc9d85eec9a71a8"} Sep 30 13:54:57 crc kubenswrapper[4763]: I0930 13:54:57.897655 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nps2w" event={"ID":"cbda58bd-f991-4d2d-ba12-f3945505afa6","Type":"ContainerStarted","Data":"6c1683e0a49795ca53dea060320152270eba1724911d0166bb8dcca337c5c33b"} Sep 30 13:54:57 crc kubenswrapper[4763]: I0930 13:54:57.902892 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1baa8aa7-e856-4478-8662-26f094036b18","Type":"ContainerDied","Data":"49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995"} Sep 30 13:54:57 crc kubenswrapper[4763]: I0930 13:54:57.902842 4763 generic.go:334] "Generic (PLEG): container finished" podID="1baa8aa7-e856-4478-8662-26f094036b18" containerID="49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995" exitCode=2 Sep 30 13:54:57 crc kubenswrapper[4763]: I0930 13:54:57.916029 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-nps2w" podStartSLOduration=3.290715022 podStartE2EDuration="30.91600809s" podCreationTimestamp="2025-09-30 13:54:27 +0000 UTC" firstStartedPulling="2025-09-30 13:54:29.158898247 +0000 UTC m=+1141.297458532" lastFinishedPulling="2025-09-30 13:54:56.784191275 +0000 UTC m=+1168.922751600" observedRunningTime="2025-09-30 13:54:57.913493307 +0000 UTC m=+1170.052053592" 
watchObservedRunningTime="2025-09-30 13:54:57.91600809 +0000 UTC m=+1170.054568375" Sep 30 13:54:58 crc kubenswrapper[4763]: I0930 13:54:58.943811 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b87bfdd4b-tbjxc" event={"ID":"15fbd312-35ac-4e62-ad60-ffccf94eab4a","Type":"ContainerStarted","Data":"6e00eb474337eb85a3ae6ce678a0a8afddc2bad42ef7bdbf41de0b427ce3b086"} Sep 30 13:54:58 crc kubenswrapper[4763]: I0930 13:54:58.944226 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:58 crc kubenswrapper[4763]: I0930 13:54:58.944266 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:54:58 crc kubenswrapper[4763]: I0930 13:54:58.976119 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5b87bfdd4b-tbjxc" podStartSLOduration=17.976099035 podStartE2EDuration="17.976099035s" podCreationTimestamp="2025-09-30 13:54:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:54:58.97191881 +0000 UTC m=+1171.110479095" watchObservedRunningTime="2025-09-30 13:54:58.976099035 +0000 UTC m=+1171.114659320" Sep 30 13:55:00 crc kubenswrapper[4763]: I0930 13:55:00.726299 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5775d899cd-b25ch" Sep 30 13:55:01 crc kubenswrapper[4763]: I0930 13:55:01.986403 4763 generic.go:334] "Generic (PLEG): container finished" podID="cbda58bd-f991-4d2d-ba12-f3945505afa6" containerID="6c1683e0a49795ca53dea060320152270eba1724911d0166bb8dcca337c5c33b" exitCode=0 Sep 30 13:55:01 crc kubenswrapper[4763]: I0930 13:55:01.986815 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nps2w" event={"ID":"cbda58bd-f991-4d2d-ba12-f3945505afa6","Type":"ContainerDied","Data":"6c1683e0a49795ca53dea060320152270eba1724911d0166bb8dcca337c5c33b"} Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.405520 4763 util.go:48] "No ready sandbox for pod can be found. 
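The pod_startup_latency_tracker entry for barbican-db-sync-nps2w encodes a simple relationship: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). The stdlib Go check below reproduces both numbers from the timestamps in the entry; the wall-clock result for the SLO figure (3.290715062s) differs from the logged 3.290715022s only in the last digits because the tracker measures the pull window on the monotonic clock (the m=+... offsets), where it comes out to exactly 3.290715022s.

```go
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Timestamps copied from the entry above.
	created := mustParse("2025-09-30 13:54:27 +0000 UTC")
	running := mustParse("2025-09-30 13:54:57.91600809 +0000 UTC")
	pullStart := mustParse("2025-09-30 13:54:29.158898247 +0000 UTC")
	pullEnd := mustParse("2025-09-30 13:54:56.784191275 +0000 UTC")

	e2e := running.Sub(created)    // podStartE2EDuration
	pull := pullEnd.Sub(pullStart) // time spent pulling the image
	slo := e2e - pull              // podStartSLOduration (pull time excluded)
	fmt.Println(e2e, pull, slo)    // 30.91600809s 27.625293028s 3.290715062s
}
```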
Need to start a new one" pod="openstack/barbican-db-sync-nps2w" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.476807 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-combined-ca-bundle\") pod \"cbda58bd-f991-4d2d-ba12-f3945505afa6\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.477004 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk86j\" (UniqueName: \"kubernetes.io/projected/cbda58bd-f991-4d2d-ba12-f3945505afa6-kube-api-access-dk86j\") pod \"cbda58bd-f991-4d2d-ba12-f3945505afa6\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.477156 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-db-sync-config-data\") pod \"cbda58bd-f991-4d2d-ba12-f3945505afa6\" (UID: \"cbda58bd-f991-4d2d-ba12-f3945505afa6\") " Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.485330 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cbda58bd-f991-4d2d-ba12-f3945505afa6" (UID: "cbda58bd-f991-4d2d-ba12-f3945505afa6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.485748 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbda58bd-f991-4d2d-ba12-f3945505afa6-kube-api-access-dk86j" (OuterVolumeSpecName: "kube-api-access-dk86j") pod "cbda58bd-f991-4d2d-ba12-f3945505afa6" (UID: "cbda58bd-f991-4d2d-ba12-f3945505afa6"). InnerVolumeSpecName "kube-api-access-dk86j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.509311 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbda58bd-f991-4d2d-ba12-f3945505afa6" (UID: "cbda58bd-f991-4d2d-ba12-f3945505afa6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.537686 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.578710 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-run-httpd\") pod \"1baa8aa7-e856-4478-8662-26f094036b18\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.578832 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-log-httpd\") pod \"1baa8aa7-e856-4478-8662-26f094036b18\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.578866 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-sg-core-conf-yaml\") pod \"1baa8aa7-e856-4478-8662-26f094036b18\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.578906 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-combined-ca-bundle\") pod \"1baa8aa7-e856-4478-8662-26f094036b18\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.579222 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjqgm\" (UniqueName: \"kubernetes.io/projected/1baa8aa7-e856-4478-8662-26f094036b18-kube-api-access-tjqgm\") pod \"1baa8aa7-e856-4478-8662-26f094036b18\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.579258 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-scripts\") pod \"1baa8aa7-e856-4478-8662-26f094036b18\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.579280 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-config-data\") pod \"1baa8aa7-e856-4478-8662-26f094036b18\" (UID: \"1baa8aa7-e856-4478-8662-26f094036b18\") " Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.579278 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1baa8aa7-e856-4478-8662-26f094036b18" (UID: "1baa8aa7-e856-4478-8662-26f094036b18"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.579330 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1baa8aa7-e856-4478-8662-26f094036b18" (UID: "1baa8aa7-e856-4478-8662-26f094036b18"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.580024 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.580050 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.580072 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk86j\" (UniqueName: \"kubernetes.io/projected/cbda58bd-f991-4d2d-ba12-f3945505afa6-kube-api-access-dk86j\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.580085 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1baa8aa7-e856-4478-8662-26f094036b18-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.580098 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cbda58bd-f991-4d2d-ba12-f3945505afa6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.583220 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1baa8aa7-e856-4478-8662-26f094036b18-kube-api-access-tjqgm" (OuterVolumeSpecName: "kube-api-access-tjqgm") pod "1baa8aa7-e856-4478-8662-26f094036b18" (UID: "1baa8aa7-e856-4478-8662-26f094036b18"). InnerVolumeSpecName "kube-api-access-tjqgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.589664 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-scripts" (OuterVolumeSpecName: "scripts") pod "1baa8aa7-e856-4478-8662-26f094036b18" (UID: "1baa8aa7-e856-4478-8662-26f094036b18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.603497 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1baa8aa7-e856-4478-8662-26f094036b18" (UID: "1baa8aa7-e856-4478-8662-26f094036b18"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.608505 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1baa8aa7-e856-4478-8662-26f094036b18" (UID: "1baa8aa7-e856-4478-8662-26f094036b18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.617845 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-config-data" (OuterVolumeSpecName: "config-data") pod "1baa8aa7-e856-4478-8662-26f094036b18" (UID: "1baa8aa7-e856-4478-8662-26f094036b18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.681535 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.681569 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.681578 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjqgm\" (UniqueName: \"kubernetes.io/projected/1baa8aa7-e856-4478-8662-26f094036b18-kube-api-access-tjqgm\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.681591 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:03 crc kubenswrapper[4763]: I0930 13:55:03.681611 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1baa8aa7-e856-4478-8662-26f094036b18-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.003657 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nps2w" event={"ID":"cbda58bd-f991-4d2d-ba12-f3945505afa6","Type":"ContainerDied","Data":"92df2ac8d5d4483dceb93330190281bb0e834e8d495f3527f99880e8ac2b6118"} Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.004277 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92df2ac8d5d4483dceb93330190281bb0e834e8d495f3527f99880e8ac2b6118" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.004417 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nps2w" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.017369 4763 generic.go:334] "Generic (PLEG): container finished" podID="1baa8aa7-e856-4478-8662-26f094036b18" containerID="6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1" exitCode=0 Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.017437 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1baa8aa7-e856-4478-8662-26f094036b18","Type":"ContainerDied","Data":"6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1"} Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.017455 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.019190 4763 scope.go:117] "RemoveContainer" containerID="49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.019163 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1baa8aa7-e856-4478-8662-26f094036b18","Type":"ContainerDied","Data":"b2b7228adf60393419d81c71e98349fdeea84d300b80076976e5d9d722d2bf8f"} Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.021573 4763 generic.go:334] "Generic (PLEG): container finished" podID="7c3f0264-cce9-436f-923d-79f807488437" containerID="d5c0dbee3becae192bb8e52217ee73cb5863f82428aefff98191c654a4fd0735" exitCode=0 Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.021632 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-q6kdz" event={"ID":"7c3f0264-cce9-436f-923d-79f807488437","Type":"ContainerDied","Data":"d5c0dbee3becae192bb8e52217ee73cb5863f82428aefff98191c654a4fd0735"} Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.048099 4763 scope.go:117] "RemoveContainer" containerID="6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.075406 4763 scope.go:117] "RemoveContainer" containerID="49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995" Sep 30 13:55:04 crc kubenswrapper[4763]: E0930 13:55:04.075918 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995\": container with ID starting with 49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995 not found: ID does not exist" containerID="49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.075966 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995"} err="failed to get container status \"49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995\": rpc error: code = NotFound desc = could not find container \"49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995\": container with ID starting with 49786bda1db23598d89e0b3a41896b8a3c26602f07a0084ffb7f80990e7cf995 not found: ID does not exist" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.075987 4763 scope.go:117] "RemoveContainer" containerID="6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1" Sep 30 13:55:04 crc kubenswrapper[4763]: E0930 13:55:04.076227 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1\": container with ID starting with 6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1 not found: ID does not exist" containerID="6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.076245 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1"} err="failed to get container status \"6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1\": rpc error: code = NotFound desc = could not find container 
\"6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1\": container with ID starting with 6100afe18001a68ffff7b0f1d1650650f7d6fa70b9caa02c59e534402b89a8c1 not found: ID does not exist" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.098647 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.110176 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.156719 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:04 crc kubenswrapper[4763]: E0930 13:55:04.157189 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbda58bd-f991-4d2d-ba12-f3945505afa6" containerName="barbican-db-sync" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.157206 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbda58bd-f991-4d2d-ba12-f3945505afa6" containerName="barbican-db-sync" Sep 30 13:55:04 crc kubenswrapper[4763]: E0930 13:55:04.157218 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1baa8aa7-e856-4478-8662-26f094036b18" containerName="ceilometer-notification-agent" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.157226 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1baa8aa7-e856-4478-8662-26f094036b18" containerName="ceilometer-notification-agent" Sep 30 13:55:04 crc kubenswrapper[4763]: E0930 13:55:04.157268 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1baa8aa7-e856-4478-8662-26f094036b18" containerName="sg-core" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.157276 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1baa8aa7-e856-4478-8662-26f094036b18" containerName="sg-core" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.157463 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1baa8aa7-e856-4478-8662-26f094036b18" containerName="ceilometer-notification-agent" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.157473 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbda58bd-f991-4d2d-ba12-f3945505afa6" containerName="barbican-db-sync" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.157493 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1baa8aa7-e856-4478-8662-26f094036b18" containerName="sg-core" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.159407 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.162448 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.162669 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.169346 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.199740 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.199886 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-scripts\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.199951 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-run-httpd\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.200016 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-log-httpd\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.200069 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbh96\" (UniqueName: \"kubernetes.io/projected/35d88ba9-06e8-4265-b713-b65722a14944-kube-api-access-fbh96\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.200100 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.200167 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-config-data\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.213901 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5f884f68c5-j4x5x"] Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.215375 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.220582 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.220661 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.220586 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w4hgl" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.244798 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f884f68c5-j4x5x"] Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.277589 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-95cdd9cf8-gbh25"] Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.300298 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.301690 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc331486-cb31-4169-a564-51f8527ec8dd-logs\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.301841 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbh96\" (UniqueName: \"kubernetes.io/projected/35d88ba9-06e8-4265-b713-b65722a14944-kube-api-access-fbh96\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.301942 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.302038 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.302122 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-combined-ca-bundle\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.302258 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-config-data\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.302404 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ll6q\" (UniqueName: \"kubernetes.io/projected/bc331486-cb31-4169-a564-51f8527ec8dd-kube-api-access-8ll6q\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.302505 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.302618 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-scripts\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.302731 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data-custom\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.302871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-run-httpd\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.303017 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-log-httpd\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.303471 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-log-httpd\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.307144 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-run-httpd\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.307565 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.314064 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.319074 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-scripts\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.326861 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-config-data\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.331179 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.337825 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-95cdd9cf8-gbh25"] Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.348747 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbh96\" (UniqueName: \"kubernetes.io/projected/35d88ba9-06e8-4265-b713-b65722a14944-kube-api-access-fbh96\") pod \"ceilometer-0\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.370233 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8454f599bf-f2d66"] Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.372381 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.391522 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8454f599bf-f2d66"] Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.406667 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb2gm\" (UniqueName: \"kubernetes.io/projected/3b479d2b-2298-4c68-b5ea-d95813621a27-kube-api-access-fb2gm\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.406739 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc331486-cb31-4169-a564-51f8527ec8dd-logs\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.406862 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.406882 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-combined-ca-bundle\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 
13:55:04.406935 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7dfv\" (UniqueName: \"kubernetes.io/projected/aea8c25c-f29f-49ba-ab27-87c8661479ab-kube-api-access-t7dfv\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.406974 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea8c25c-f29f-49ba-ab27-87c8661479ab-logs\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.407038 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-nb\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.407062 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.407106 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ll6q\" (UniqueName: \"kubernetes.io/projected/bc331486-cb31-4169-a564-51f8527ec8dd-kube-api-access-8ll6q\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.407131 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-svc\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.407173 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-sb\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.407197 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data-custom\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.407220 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-swift-storage-0\") 
pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.407254 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data-custom\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.407273 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc331486-cb31-4169-a564-51f8527ec8dd-logs\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.407278 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-config\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.407354 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-combined-ca-bundle\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.411488 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-combined-ca-bundle\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.413208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data-custom\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.419790 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.432714 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ll6q\" (UniqueName: \"kubernetes.io/projected/bc331486-cb31-4169-a564-51f8527ec8dd-kube-api-access-8ll6q\") pod \"barbican-worker-5f884f68c5-j4x5x\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.452536 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-689f4d67f6-55mbs"] Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 
13:55:04.454539 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.465340 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-689f4d67f6-55mbs"] Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.465428 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.508427 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j47xn\" (UniqueName: \"kubernetes.io/projected/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-kube-api-access-j47xn\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.508476 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb2gm\" (UniqueName: \"kubernetes.io/projected/3b479d2b-2298-4c68-b5ea-d95813621a27-kube-api-access-fb2gm\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.508581 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7dfv\" (UniqueName: \"kubernetes.io/projected/aea8c25c-f29f-49ba-ab27-87c8661479ab-kube-api-access-t7dfv\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.508650 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea8c25c-f29f-49ba-ab27-87c8661479ab-logs\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.508699 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-combined-ca-bundle\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.508743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-nb\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.508774 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.509103 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/aea8c25c-f29f-49ba-ab27-87c8661479ab-logs\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.509438 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-svc\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.509459 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-logs\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.509489 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-sb\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.509517 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-swift-storage-0\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.509532 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data-custom\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.509577 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-config\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.509619 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-combined-ca-bundle\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.509637 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.509710 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data-custom\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.509729 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-nb\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.510222 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-svc\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.510394 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-config\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.511037 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-sb\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.511097 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-swift-storage-0\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.512572 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data-custom\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.515563 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.522443 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-combined-ca-bundle\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.523091 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1baa8aa7-e856-4478-8662-26f094036b18" 
path="/var/lib/kubelet/pods/1baa8aa7-e856-4478-8662-26f094036b18/volumes" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.526101 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb2gm\" (UniqueName: \"kubernetes.io/projected/3b479d2b-2298-4c68-b5ea-d95813621a27-kube-api-access-fb2gm\") pod \"dnsmasq-dns-8454f599bf-f2d66\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.529664 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7dfv\" (UniqueName: \"kubernetes.io/projected/aea8c25c-f29f-49ba-ab27-87c8661479ab-kube-api-access-t7dfv\") pod \"barbican-keystone-listener-95cdd9cf8-gbh25\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.535810 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.571755 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.612451 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-combined-ca-bundle\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.612515 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-logs\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.612569 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.612631 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data-custom\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.612684 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j47xn\" (UniqueName: \"kubernetes.io/projected/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-kube-api-access-j47xn\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.613252 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-logs\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " 
pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.615806 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-combined-ca-bundle\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.616288 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data-custom\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.617018 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.641240 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j47xn\" (UniqueName: \"kubernetes.io/projected/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-kube-api-access-j47xn\") pod \"barbican-api-689f4d67f6-55mbs\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.792490 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.800120 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.813325 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:04 crc kubenswrapper[4763]: I0930 13:55:04.819934 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.053328 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35d88ba9-06e8-4265-b713-b65722a14944","Type":"ContainerStarted","Data":"dfbfebca04c594748fd98c074159e7c0e47ad75d3bf1972d99aea00fa14cfcee"} Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.106582 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f884f68c5-j4x5x"] Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.311642 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-95cdd9cf8-gbh25"] Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.383284 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-689f4d67f6-55mbs"] Sep 30 13:55:05 crc kubenswrapper[4763]: W0930 13:55:05.393905 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda87af91b_c71e_4d5e_a7d2_10fa502a6dc9.slice/crio-933d9faf7546ba20cb10b6610fb80e7771030beb232c6a0b4ef183f67691dcda WatchSource:0}: Error finding container 933d9faf7546ba20cb10b6610fb80e7771030beb232c6a0b4ef183f67691dcda: Status 404 returned error can't find the container with id 933d9faf7546ba20cb10b6610fb80e7771030beb232c6a0b4ef183f67691dcda Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.397727 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8454f599bf-f2d66"] Sep 30 13:55:05 crc kubenswrapper[4763]: W0930 13:55:05.407453 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b479d2b_2298_4c68_b5ea_d95813621a27.slice/crio-f571752f2b29c07b336541c052a59d62a24ec560d89b48b177c246642e8e221e WatchSource:0}: Error finding container f571752f2b29c07b336541c052a59d62a24ec560d89b48b177c246642e8e221e: Status 404 returned error can't find the container with id f571752f2b29c07b336541c052a59d62a24ec560d89b48b177c246642e8e221e Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.752534 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.754618 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.758297 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6gl4k" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.758668 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.760223 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.775572 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.832454 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sthdv\" (UniqueName: \"kubernetes.io/projected/98e98c9d-b727-4c5b-857b-13064b0ef92f-kube-api-access-sthdv\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.833181 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.833292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.833442 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config-secret\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.912094 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-q6kdz" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.933750 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-combined-ca-bundle\") pod \"7c3f0264-cce9-436f-923d-79f807488437\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.933941 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57dpd\" (UniqueName: \"kubernetes.io/projected/7c3f0264-cce9-436f-923d-79f807488437-kube-api-access-57dpd\") pod \"7c3f0264-cce9-436f-923d-79f807488437\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.934035 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-config-data\") pod \"7c3f0264-cce9-436f-923d-79f807488437\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.934726 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-db-sync-config-data\") pod \"7c3f0264-cce9-436f-923d-79f807488437\" (UID: \"7c3f0264-cce9-436f-923d-79f807488437\") " Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.934904 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config-secret\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.935056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sthdv\" (UniqueName: \"kubernetes.io/projected/98e98c9d-b727-4c5b-857b-13064b0ef92f-kube-api-access-sthdv\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.935181 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.935262 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.937876 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3f0264-cce9-436f-923d-79f807488437-kube-api-access-57dpd" (OuterVolumeSpecName: "kube-api-access-57dpd") pod "7c3f0264-cce9-436f-923d-79f807488437" (UID: "7c3f0264-cce9-436f-923d-79f807488437"). InnerVolumeSpecName "kube-api-access-57dpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.944123 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.945629 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.952143 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7c3f0264-cce9-436f-923d-79f807488437" (UID: "7c3f0264-cce9-436f-923d-79f807488437"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.952148 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config-secret\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.957283 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sthdv\" (UniqueName: \"kubernetes.io/projected/98e98c9d-b727-4c5b-857b-13064b0ef92f-kube-api-access-sthdv\") pod \"openstackclient\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " pod="openstack/openstackclient" Sep 30 13:55:05 crc kubenswrapper[4763]: I0930 13:55:05.984818 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c3f0264-cce9-436f-923d-79f807488437" (UID: "7c3f0264-cce9-436f-923d-79f807488437"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.036421 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57dpd\" (UniqueName: \"kubernetes.io/projected/7c3f0264-cce9-436f-923d-79f807488437-kube-api-access-57dpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.036461 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.036496 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.051356 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-config-data" (OuterVolumeSpecName: "config-data") pod "7c3f0264-cce9-436f-923d-79f807488437" (UID: "7c3f0264-cce9-436f-923d-79f807488437"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.077703 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.081920 4763 generic.go:334] "Generic (PLEG): container finished" podID="3b479d2b-2298-4c68-b5ea-d95813621a27" containerID="6ec914d8533c897a9507ffef847c34272fbb71c58d9a548d165d423e2276eb61" exitCode=0 Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.082045 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" event={"ID":"3b479d2b-2298-4c68-b5ea-d95813621a27","Type":"ContainerDied","Data":"6ec914d8533c897a9507ffef847c34272fbb71c58d9a548d165d423e2276eb61"} Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.082336 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" event={"ID":"3b479d2b-2298-4c68-b5ea-d95813621a27","Type":"ContainerStarted","Data":"f571752f2b29c07b336541c052a59d62a24ec560d89b48b177c246642e8e221e"} Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.090019 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f884f68c5-j4x5x" event={"ID":"bc331486-cb31-4169-a564-51f8527ec8dd","Type":"ContainerStarted","Data":"7e847ac3e8459d07c05fadee37d60c5c57e7e16c7493a39a9aba11429525807c"} Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.091695 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" event={"ID":"aea8c25c-f29f-49ba-ab27-87c8661479ab","Type":"ContainerStarted","Data":"a4d7497733080914437aa44414c72e1bd14ae53940dcea5e77876877cb23fa76"} Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.102360 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-689f4d67f6-55mbs" event={"ID":"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9","Type":"ContainerStarted","Data":"cea0a32b6472924f81a8f5a6c9c76fc4d5b769cea550231a768186dd60893a0d"} Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.102428 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-689f4d67f6-55mbs" 
event={"ID":"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9","Type":"ContainerStarted","Data":"933d9faf7546ba20cb10b6610fb80e7771030beb232c6a0b4ef183f67691dcda"} Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.123020 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-q6kdz" event={"ID":"7c3f0264-cce9-436f-923d-79f807488437","Type":"ContainerDied","Data":"3bd44a08b3b2854300fa27f2dbc2e4d0eb17a36cb3ac60ccb5c69b81a11f65a7"} Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.123062 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bd44a08b3b2854300fa27f2dbc2e4d0eb17a36cb3ac60ccb5c69b81a11f65a7" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.123129 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-q6kdz" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.139322 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3f0264-cce9-436f-923d-79f807488437-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.422798 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8454f599bf-f2d66"] Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.472668 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-d4j9z"] Sep 30 13:55:06 crc kubenswrapper[4763]: E0930 13:55:06.473125 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3f0264-cce9-436f-923d-79f807488437" containerName="glance-db-sync" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.473139 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3f0264-cce9-436f-923d-79f807488437" containerName="glance-db-sync" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.473367 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3f0264-cce9-436f-923d-79f807488437" containerName="glance-db-sync" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.474385 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.527311 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-d4j9z"] Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.549446 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.549525 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.549674 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbsn2\" (UniqueName: \"kubernetes.io/projected/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-kube-api-access-lbsn2\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.549721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.549887 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-svc\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.549933 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-config\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.654051 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-svc\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.654480 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-config\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.654572 4763 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.654636 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.654741 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.654766 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbsn2\" (UniqueName: \"kubernetes.io/projected/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-kube-api-access-lbsn2\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.655689 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.657452 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-config\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.657518 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.658268 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-svc\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.658776 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.676688 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbsn2\" (UniqueName: 
\"kubernetes.io/projected/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-kube-api-access-lbsn2\") pod \"dnsmasq-dns-5c856dc5f9-d4j9z\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.776867 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 13:55:06 crc kubenswrapper[4763]: I0930 13:55:06.816081 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.093552 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-d4j9z"] Sep 30 13:55:07 crc kubenswrapper[4763]: W0930 13:55:07.106470 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod816d9b64_2c0b_4b24_822d_fe8ec3b6ea9c.slice/crio-e97365fc1ef4b818e47d073550b6b78882a09b14b45a2bf8d3212128e27a1349 WatchSource:0}: Error finding container e97365fc1ef4b818e47d073550b6b78882a09b14b45a2bf8d3212128e27a1349: Status 404 returned error can't find the container with id e97365fc1ef4b818e47d073550b6b78882a09b14b45a2bf8d3212128e27a1349 Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.143105 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"98e98c9d-b727-4c5b-857b-13064b0ef92f","Type":"ContainerStarted","Data":"12f707ba6b4c827c1bbabdede1ef6f690e2e12e2b0dd3f1e50f97b5306348277"} Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.146814 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-689f4d67f6-55mbs" event={"ID":"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9","Type":"ContainerStarted","Data":"759be06ade67c37c53371021ab1577180afe88e157c3115d7ac4ebac302afade"} Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.147230 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.147302 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.151247 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" event={"ID":"3b479d2b-2298-4c68-b5ea-d95813621a27","Type":"ContainerStarted","Data":"82988d266cd31d978d2e1496254f0c8ada81437eee0452433b2106a92a23594f"} Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.151408 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" podUID="3b479d2b-2298-4c68-b5ea-d95813621a27" containerName="dnsmasq-dns" containerID="cri-o://82988d266cd31d978d2e1496254f0c8ada81437eee0452433b2106a92a23594f" gracePeriod=10 Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.151538 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.156976 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" event={"ID":"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c","Type":"ContainerStarted","Data":"e97365fc1ef4b818e47d073550b6b78882a09b14b45a2bf8d3212128e27a1349"} Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.179279 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"35d88ba9-06e8-4265-b713-b65722a14944","Type":"ContainerStarted","Data":"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc"} Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.200738 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-689f4d67f6-55mbs" podStartSLOduration=3.200719492 podStartE2EDuration="3.200719492s" podCreationTimestamp="2025-09-30 13:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:07.173675064 +0000 UTC m=+1179.312235339" watchObservedRunningTime="2025-09-30 13:55:07.200719492 +0000 UTC m=+1179.339279777" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.202091 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" podStartSLOduration=3.202083187 podStartE2EDuration="3.202083187s" podCreationTimestamp="2025-09-30 13:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:07.195110652 +0000 UTC m=+1179.333670957" watchObservedRunningTime="2025-09-30 13:55:07.202083187 +0000 UTC m=+1179.340643472" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.322640 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.324496 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.326869 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xn426" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.327843 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.328390 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.339171 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.478588 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.479617 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-logs\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.479810 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc 
kubenswrapper[4763]: I0930 13:55:07.479938 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.480027 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr4f6\" (UniqueName: \"kubernetes.io/projected/4fce4bf0-7ba2-414b-9eb6-b285923c740a-kube-api-access-tr4f6\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.480111 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.480180 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.578429 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.581871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.585700 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr4f6\" (UniqueName: \"kubernetes.io/projected/4fce4bf0-7ba2-414b-9eb6-b285923c740a-kube-api-access-tr4f6\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.585778 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.585811 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.585992 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.586047 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-logs\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.586258 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.586779 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.587610 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.588359 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.589654 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-logs\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.592186 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.599534 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.605988 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.620401 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc 
kubenswrapper[4763]: I0930 13:55:07.622286 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr4f6\" (UniqueName: \"kubernetes.io/projected/4fce4bf0-7ba2-414b-9eb6-b285923c740a-kube-api-access-tr4f6\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.646344 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.667753 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.687744 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.687787 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c2rc\" (UniqueName: \"kubernetes.io/projected/05246186-dc4d-4e95-939f-1b49da0c540c-kube-api-access-6c2rc\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.687827 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.687853 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.687891 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-logs\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.687910 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.687928 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 
13:55:07.687941 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.789743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c2rc\" (UniqueName: \"kubernetes.io/projected/05246186-dc4d-4e95-939f-1b49da0c540c-kube-api-access-6c2rc\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.789807 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.789840 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.789882 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-logs\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.789901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.790068 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.790494 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.790574 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.791070 4763 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-logs\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.791570 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.795724 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.797360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.797379 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.825805 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c2rc\" (UniqueName: \"kubernetes.io/projected/05246186-dc4d-4e95-939f-1b49da0c540c-kube-api-access-6c2rc\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.846764 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.900348 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-9b5dc4bf7-vwl5v"] Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.903776 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.910879 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.911140 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.911259 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.912822 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9b5dc4bf7-vwl5v"] Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.995144 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-log-httpd\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.995204 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-combined-ca-bundle\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.995282 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-public-tls-certs\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.995304 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2t89\" (UniqueName: \"kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-kube-api-access-d2t89\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.995323 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-run-httpd\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.995386 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-config-data\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.995416 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-etc-swift\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " 
pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:07 crc kubenswrapper[4763]: I0930 13:55:07.995467 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-internal-tls-certs\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.011043 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.097662 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-log-httpd\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.097743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-combined-ca-bundle\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.097830 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-public-tls-certs\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.097864 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2t89\" (UniqueName: \"kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-kube-api-access-d2t89\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.097894 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-run-httpd\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.097915 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-config-data\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.097930 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-etc-swift\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.098000 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-internal-tls-certs\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.098242 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-log-httpd\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.101758 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-run-httpd\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.104400 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-config-data\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.106319 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-internal-tls-certs\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.107076 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-etc-swift\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.107731 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-public-tls-certs\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.108067 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-combined-ca-bundle\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.116978 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2t89\" (UniqueName: \"kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-kube-api-access-d2t89\") pod \"swift-proxy-9b5dc4bf7-vwl5v\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.192790 4763 generic.go:334] "Generic (PLEG): container finished" podID="816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" containerID="d4e2a792cf4575c78a6c335d5016196dbad39501ff3a8c770eb5c7136d095ee4" exitCode=0 Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.193175 4763 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" event={"ID":"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c","Type":"ContainerDied","Data":"d4e2a792cf4575c78a6c335d5016196dbad39501ff3a8c770eb5c7136d095ee4"} Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.201526 4763 generic.go:334] "Generic (PLEG): container finished" podID="3b479d2b-2298-4c68-b5ea-d95813621a27" containerID="82988d266cd31d978d2e1496254f0c8ada81437eee0452433b2106a92a23594f" exitCode=0 Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.201549 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" event={"ID":"3b479d2b-2298-4c68-b5ea-d95813621a27","Type":"ContainerDied","Data":"82988d266cd31d978d2e1496254f0c8ada81437eee0452433b2106a92a23594f"} Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.234331 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.245007 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-67bf5b69fb-ff2xw"] Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.248704 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.253626 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.253843 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.303519 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67bf5b69fb-ff2xw"] Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.404685 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.404738 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-internal-tls-certs\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.404781 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5llr\" (UniqueName: \"kubernetes.io/projected/5ed0d19e-bbae-437d-9083-cded205c65f6-kube-api-access-b5llr\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.404802 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-combined-ca-bundle\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 
13:55:08.404839 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data-custom\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.404867 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-public-tls-certs\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.404897 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed0d19e-bbae-437d-9083-cded205c65f6-logs\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.518288 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.518371 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-internal-tls-certs\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.520638 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5llr\" (UniqueName: \"kubernetes.io/projected/5ed0d19e-bbae-437d-9083-cded205c65f6-kube-api-access-b5llr\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.520700 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-combined-ca-bundle\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.520777 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data-custom\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.521017 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-public-tls-certs\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 
crc kubenswrapper[4763]: I0930 13:55:08.521090 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed0d19e-bbae-437d-9083-cded205c65f6-logs\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.524656 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed0d19e-bbae-437d-9083-cded205c65f6-logs\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.528936 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-combined-ca-bundle\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.529001 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-internal-tls-certs\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.533580 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.535899 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-public-tls-certs\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.538657 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data-custom\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.552190 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5llr\" (UniqueName: \"kubernetes.io/projected/5ed0d19e-bbae-437d-9083-cded205c65f6-kube-api-access-b5llr\") pod \"barbican-api-67bf5b69fb-ff2xw\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.624301 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.842811 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.931370 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-sb\") pod \"3b479d2b-2298-4c68-b5ea-d95813621a27\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.931476 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb2gm\" (UniqueName: \"kubernetes.io/projected/3b479d2b-2298-4c68-b5ea-d95813621a27-kube-api-access-fb2gm\") pod \"3b479d2b-2298-4c68-b5ea-d95813621a27\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.931529 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-svc\") pod \"3b479d2b-2298-4c68-b5ea-d95813621a27\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.931553 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-swift-storage-0\") pod \"3b479d2b-2298-4c68-b5ea-d95813621a27\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.931615 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-config\") pod \"3b479d2b-2298-4c68-b5ea-d95813621a27\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.931650 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-nb\") pod \"3b479d2b-2298-4c68-b5ea-d95813621a27\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " Sep 30 13:55:08 crc kubenswrapper[4763]: I0930 13:55:08.935648 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b479d2b-2298-4c68-b5ea-d95813621a27-kube-api-access-fb2gm" (OuterVolumeSpecName: "kube-api-access-fb2gm") pod "3b479d2b-2298-4c68-b5ea-d95813621a27" (UID: "3b479d2b-2298-4c68-b5ea-d95813621a27"). InnerVolumeSpecName "kube-api-access-fb2gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.010043 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b479d2b-2298-4c68-b5ea-d95813621a27" (UID: "3b479d2b-2298-4c68-b5ea-d95813621a27"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.023271 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b479d2b-2298-4c68-b5ea-d95813621a27" (UID: "3b479d2b-2298-4c68-b5ea-d95813621a27"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.039528 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b479d2b-2298-4c68-b5ea-d95813621a27" (UID: "3b479d2b-2298-4c68-b5ea-d95813621a27"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.035239 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-config" (OuterVolumeSpecName: "config") pod "3b479d2b-2298-4c68-b5ea-d95813621a27" (UID: "3b479d2b-2298-4c68-b5ea-d95813621a27"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.040080 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-svc\") pod \"3b479d2b-2298-4c68-b5ea-d95813621a27\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.040157 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-config\") pod \"3b479d2b-2298-4c68-b5ea-d95813621a27\" (UID: \"3b479d2b-2298-4c68-b5ea-d95813621a27\") " Sep 30 13:55:09 crc kubenswrapper[4763]: W0930 13:55:09.040923 4763 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3b479d2b-2298-4c68-b5ea-d95813621a27/volumes/kubernetes.io~configmap/dns-svc Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.040955 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.040979 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb2gm\" (UniqueName: \"kubernetes.io/projected/3b479d2b-2298-4c68-b5ea-d95813621a27-kube-api-access-fb2gm\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.040991 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.040951 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b479d2b-2298-4c68-b5ea-d95813621a27" (UID: "3b479d2b-2298-4c68-b5ea-d95813621a27"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:09 crc kubenswrapper[4763]: W0930 13:55:09.041000 4763 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3b479d2b-2298-4c68-b5ea-d95813621a27/volumes/kubernetes.io~configmap/config Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.041028 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-config" (OuterVolumeSpecName: "config") pod "3b479d2b-2298-4c68-b5ea-d95813621a27" (UID: "3b479d2b-2298-4c68-b5ea-d95813621a27"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.075347 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b479d2b-2298-4c68-b5ea-d95813621a27" (UID: "3b479d2b-2298-4c68-b5ea-d95813621a27"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.143803 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.143829 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.143840 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b479d2b-2298-4c68-b5ea-d95813621a27-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.222731 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.240445 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.240484 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8454f599bf-f2d66" event={"ID":"3b479d2b-2298-4c68-b5ea-d95813621a27","Type":"ContainerDied","Data":"f571752f2b29c07b336541c052a59d62a24ec560d89b48b177c246642e8e221e"} Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.240551 4763 scope.go:117] "RemoveContainer" containerID="82988d266cd31d978d2e1496254f0c8ada81437eee0452433b2106a92a23594f" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.251740 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35d88ba9-06e8-4265-b713-b65722a14944","Type":"ContainerStarted","Data":"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47"} Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.342584 4763 scope.go:117] "RemoveContainer" containerID="6ec914d8533c897a9507ffef847c34272fbb71c58d9a548d165d423e2276eb61" Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.357941 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8454f599bf-f2d66"] Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.368179 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8454f599bf-f2d66"] Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.609227 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67bf5b69fb-ff2xw"] Sep 30 13:55:09 crc kubenswrapper[4763]: W0930 13:55:09.666450 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ed0d19e_bbae_437d_9083_cded205c65f6.slice/crio-b1babd7d8f632f71496cd71e56038f714301a0fbc190f6fa9720ac20a4e827ce WatchSource:0}: Error finding container b1babd7d8f632f71496cd71e56038f714301a0fbc190f6fa9720ac20a4e827ce: Status 404 returned error can't find the container with id b1babd7d8f632f71496cd71e56038f714301a0fbc190f6fa9720ac20a4e827ce Sep 30 13:55:09 crc kubenswrapper[4763]: I0930 13:55:09.692338 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.279217 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.280096 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" event={"ID":"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c","Type":"ContainerStarted","Data":"f9a34c1f18e1e6a6311e5d6acc5d8c2592695c8e4e72b64dce792788c444b41b"} Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.280394 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.289280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fce4bf0-7ba2-414b-9eb6-b285923c740a","Type":"ContainerStarted","Data":"f3793df3c53012c94038d3bf19ebe3051fdc7f9c106c0700a747f9e533aa2a03"} Sep 30 13:55:10 crc kubenswrapper[4763]: W0930 13:55:10.289801 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05246186_dc4d_4e95_939f_1b49da0c540c.slice/crio-4ff7272b4d10fd23864b6a8420ee41d0f0b0b134a73e64c67482003f638cba58 WatchSource:0}: Error finding 
container 4ff7272b4d10fd23864b6a8420ee41d0f0b0b134a73e64c67482003f638cba58: Status 404 returned error can't find the container with id 4ff7272b4d10fd23864b6a8420ee41d0f0b0b134a73e64c67482003f638cba58 Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.302781 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" podStartSLOduration=4.302761542 podStartE2EDuration="4.302761542s" podCreationTimestamp="2025-09-30 13:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:10.298824794 +0000 UTC m=+1182.437385079" watchObservedRunningTime="2025-09-30 13:55:10.302761542 +0000 UTC m=+1182.441321827" Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.310076 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f884f68c5-j4x5x" event={"ID":"bc331486-cb31-4169-a564-51f8527ec8dd","Type":"ContainerStarted","Data":"42b30ec43f1257d28794be7be6660214b1f78e8dcc9ff724d26c8c28a27d8b51"} Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.310120 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f884f68c5-j4x5x" event={"ID":"bc331486-cb31-4169-a564-51f8527ec8dd","Type":"ContainerStarted","Data":"c8f854a8e0e8f8b63357c20a3ee69e40c128f3f024eaa531bfc9fe89a8b73296"} Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.339800 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35d88ba9-06e8-4265-b713-b65722a14944","Type":"ContainerStarted","Data":"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5"} Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.345064 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5f884f68c5-j4x5x" podStartSLOduration=2.523861106 podStartE2EDuration="6.344833539s" podCreationTimestamp="2025-09-30 13:55:04 +0000 UTC" firstStartedPulling="2025-09-30 13:55:05.116732582 +0000 UTC m=+1177.255292867" lastFinishedPulling="2025-09-30 13:55:08.937705015 +0000 UTC m=+1181.076265300" observedRunningTime="2025-09-30 13:55:10.336009427 +0000 UTC m=+1182.474569712" watchObservedRunningTime="2025-09-30 13:55:10.344833539 +0000 UTC m=+1182.483393814" Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.363136 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" event={"ID":"aea8c25c-f29f-49ba-ab27-87c8661479ab","Type":"ContainerStarted","Data":"8a1c727d333559a452f33984696e78504154274594d0d689186dfd04e4589f8b"} Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.363178 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" event={"ID":"aea8c25c-f29f-49ba-ab27-87c8661479ab","Type":"ContainerStarted","Data":"6f12ce438cdfca7bff7ec6b8d59f8bef94cce949cecfbd974e69e68743678f6d"} Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.374365 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67bf5b69fb-ff2xw" event={"ID":"5ed0d19e-bbae-437d-9083-cded205c65f6","Type":"ContainerStarted","Data":"59b75d8a10fde456e075a28d38cdb8ef12838b4b2acfbdbbde03b04350659d72"} Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.374789 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67bf5b69fb-ff2xw" 
event={"ID":"5ed0d19e-bbae-437d-9083-cded205c65f6","Type":"ContainerStarted","Data":"e8da034aa8e3585dad3aebd273d533766fe99e7f47d29b8c0da60ce7e190c340"} Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.374816 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.374826 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.374834 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67bf5b69fb-ff2xw" event={"ID":"5ed0d19e-bbae-437d-9083-cded205c65f6","Type":"ContainerStarted","Data":"b1babd7d8f632f71496cd71e56038f714301a0fbc190f6fa9720ac20a4e827ce"} Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.415059 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" podStartSLOduration=2.8010111220000002 podStartE2EDuration="6.415036391s" podCreationTimestamp="2025-09-30 13:55:04 +0000 UTC" firstStartedPulling="2025-09-30 13:55:05.322774453 +0000 UTC m=+1177.461334728" lastFinishedPulling="2025-09-30 13:55:08.936799712 +0000 UTC m=+1181.075359997" observedRunningTime="2025-09-30 13:55:10.388854694 +0000 UTC m=+1182.527414979" watchObservedRunningTime="2025-09-30 13:55:10.415036391 +0000 UTC m=+1182.553596676" Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.427993 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-67bf5b69fb-ff2xw" podStartSLOduration=2.4279719650000002 podStartE2EDuration="2.427971965s" podCreationTimestamp="2025-09-30 13:55:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:10.414254861 +0000 UTC m=+1182.552815146" watchObservedRunningTime="2025-09-30 13:55:10.427971965 +0000 UTC m=+1182.566532250" Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.537453 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b479d2b-2298-4c68-b5ea-d95813621a27" path="/var/lib/kubelet/pods/3b479d2b-2298-4c68-b5ea-d95813621a27/volumes" Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.583040 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9b5dc4bf7-vwl5v"] Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.690047 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:10 crc kubenswrapper[4763]: I0930 13:55:10.771854 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:11 crc kubenswrapper[4763]: I0930 13:55:11.419362 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" event={"ID":"a99f7915-f0b7-498a-941d-b02d87df4b98","Type":"ContainerStarted","Data":"3e0c5a3566149a9091d7d20437254c474eb77aa49ad67fd687b671660064adfb"} Sep 30 13:55:11 crc kubenswrapper[4763]: I0930 13:55:11.419673 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" event={"ID":"a99f7915-f0b7-498a-941d-b02d87df4b98","Type":"ContainerStarted","Data":"e9053a0e77e480b226c00165f33e24c7498cbace6bd9982da58ebeb4a396e7bd"} Sep 30 13:55:11 crc kubenswrapper[4763]: I0930 13:55:11.430520 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"4fce4bf0-7ba2-414b-9eb6-b285923c740a","Type":"ContainerStarted","Data":"11e4ee17318053ee5ba015cabc6d5a0147fb781b3c1dd79779d791fcec0064bb"} Sep 30 13:55:11 crc kubenswrapper[4763]: I0930 13:55:11.438700 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"05246186-dc4d-4e95-939f-1b49da0c540c","Type":"ContainerStarted","Data":"4ff7272b4d10fd23864b6a8420ee41d0f0b0b134a73e64c67482003f638cba58"} Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.455083 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="ceilometer-central-agent" containerID="cri-o://5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc" gracePeriod=30 Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.455115 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="proxy-httpd" containerID="cri-o://149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1" gracePeriod=30 Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.455150 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="ceilometer-notification-agent" containerID="cri-o://d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47" gracePeriod=30 Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.455162 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="sg-core" containerID="cri-o://4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5" gracePeriod=30 Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.455202 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35d88ba9-06e8-4265-b713-b65722a14944","Type":"ContainerStarted","Data":"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1"} Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.455750 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.465899 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjgmb" event={"ID":"e000c274-a7a0-493f-a0ea-537e5c474cb0","Type":"ContainerStarted","Data":"1e418fd879a76bc974ec3882d16798e171a7acc9a5c7ba9107b332d8d6aea0fc"} Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.475536 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" event={"ID":"a99f7915-f0b7-498a-941d-b02d87df4b98","Type":"ContainerStarted","Data":"81acae4ba8a1fe31f7b7ec84384f6c7903c26616e109d6de747404a115029a84"} Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.475731 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.476352 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.479385 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"4fce4bf0-7ba2-414b-9eb6-b285923c740a","Type":"ContainerStarted","Data":"2d3b7bb9d9d5723e530ee66f98c94e5c32c46098021d8a9014e6675556a15deb"} Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.479498 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4fce4bf0-7ba2-414b-9eb6-b285923c740a" containerName="glance-log" containerID="cri-o://11e4ee17318053ee5ba015cabc6d5a0147fb781b3c1dd79779d791fcec0064bb" gracePeriod=30 Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.479661 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4fce4bf0-7ba2-414b-9eb6-b285923c740a" containerName="glance-httpd" containerID="cri-o://2d3b7bb9d9d5723e530ee66f98c94e5c32c46098021d8a9014e6675556a15deb" gracePeriod=30 Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.488329 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.314168153 podStartE2EDuration="8.488309882s" podCreationTimestamp="2025-09-30 13:55:04 +0000 UTC" firstStartedPulling="2025-09-30 13:55:04.860482541 +0000 UTC m=+1176.999042826" lastFinishedPulling="2025-09-30 13:55:11.03462427 +0000 UTC m=+1183.173184555" observedRunningTime="2025-09-30 13:55:12.474412374 +0000 UTC m=+1184.612972659" watchObservedRunningTime="2025-09-30 13:55:12.488309882 +0000 UTC m=+1184.626870167" Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.493100 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="05246186-dc4d-4e95-939f-1b49da0c540c" containerName="glance-log" containerID="cri-o://61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95" gracePeriod=30 Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.493351 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="05246186-dc4d-4e95-939f-1b49da0c540c" containerName="glance-httpd" containerID="cri-o://158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32" gracePeriod=30 Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.490492 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-zjgmb" podStartSLOduration=5.317947565 podStartE2EDuration="50.490483067s" podCreationTimestamp="2025-09-30 13:54:22 +0000 UTC" firstStartedPulling="2025-09-30 13:54:24.822522699 +0000 UTC m=+1136.961082984" lastFinishedPulling="2025-09-30 13:55:09.995058201 +0000 UTC m=+1182.133618486" observedRunningTime="2025-09-30 13:55:12.489204875 +0000 UTC m=+1184.627765160" watchObservedRunningTime="2025-09-30 13:55:12.490483067 +0000 UTC m=+1184.629043352" Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.524938 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"05246186-dc4d-4e95-939f-1b49da0c540c","Type":"ContainerStarted","Data":"158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32"} Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.524983 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"05246186-dc4d-4e95-939f-1b49da0c540c","Type":"ContainerStarted","Data":"61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95"} Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.524999 4763 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" podStartSLOduration=5.524986764 podStartE2EDuration="5.524986764s" podCreationTimestamp="2025-09-30 13:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:12.517442014 +0000 UTC m=+1184.656002299" watchObservedRunningTime="2025-09-30 13:55:12.524986764 +0000 UTC m=+1184.663547039" Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.553940 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.553917919 podStartE2EDuration="6.553917919s" podCreationTimestamp="2025-09-30 13:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:12.540058951 +0000 UTC m=+1184.678619236" watchObservedRunningTime="2025-09-30 13:55:12.553917919 +0000 UTC m=+1184.692478204" Sep 30 13:55:12 crc kubenswrapper[4763]: I0930 13:55:12.570305 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.57028588 podStartE2EDuration="6.57028588s" podCreationTimestamp="2025-09-30 13:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:12.567765867 +0000 UTC m=+1184.706326152" watchObservedRunningTime="2025-09-30 13:55:12.57028588 +0000 UTC m=+1184.708846165" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.310533 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.441801 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-scripts\") pod \"05246186-dc4d-4e95-939f-1b49da0c540c\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.441855 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c2rc\" (UniqueName: \"kubernetes.io/projected/05246186-dc4d-4e95-939f-1b49da0c540c-kube-api-access-6c2rc\") pod \"05246186-dc4d-4e95-939f-1b49da0c540c\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.441885 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"05246186-dc4d-4e95-939f-1b49da0c540c\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.441979 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-config-data\") pod \"05246186-dc4d-4e95-939f-1b49da0c540c\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.442059 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-httpd-run\") pod \"05246186-dc4d-4e95-939f-1b49da0c540c\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " Sep 30 
13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.442106 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-logs\") pod \"05246186-dc4d-4e95-939f-1b49da0c540c\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.442133 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-combined-ca-bundle\") pod \"05246186-dc4d-4e95-939f-1b49da0c540c\" (UID: \"05246186-dc4d-4e95-939f-1b49da0c540c\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.442415 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "05246186-dc4d-4e95-939f-1b49da0c540c" (UID: "05246186-dc4d-4e95-939f-1b49da0c540c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.442542 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-logs" (OuterVolumeSpecName: "logs") pod "05246186-dc4d-4e95-939f-1b49da0c540c" (UID: "05246186-dc4d-4e95-939f-1b49da0c540c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.442636 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.450835 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "05246186-dc4d-4e95-939f-1b49da0c540c" (UID: "05246186-dc4d-4e95-939f-1b49da0c540c"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.451763 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05246186-dc4d-4e95-939f-1b49da0c540c-kube-api-access-6c2rc" (OuterVolumeSpecName: "kube-api-access-6c2rc") pod "05246186-dc4d-4e95-939f-1b49da0c540c" (UID: "05246186-dc4d-4e95-939f-1b49da0c540c"). InnerVolumeSpecName "kube-api-access-6c2rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.454741 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-scripts" (OuterVolumeSpecName: "scripts") pod "05246186-dc4d-4e95-939f-1b49da0c540c" (UID: "05246186-dc4d-4e95-939f-1b49da0c540c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.528674 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05246186-dc4d-4e95-939f-1b49da0c540c" (UID: "05246186-dc4d-4e95-939f-1b49da0c540c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.539538 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-config-data" (OuterVolumeSpecName: "config-data") pod "05246186-dc4d-4e95-939f-1b49da0c540c" (UID: "05246186-dc4d-4e95-939f-1b49da0c540c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.549026 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c2rc\" (UniqueName: \"kubernetes.io/projected/05246186-dc4d-4e95-939f-1b49da0c540c-kube-api-access-6c2rc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.549089 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.549100 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.549117 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05246186-dc4d-4e95-939f-1b49da0c540c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.549127 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.549136 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05246186-dc4d-4e95-939f-1b49da0c540c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.585379 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.593544 4763 generic.go:334] "Generic (PLEG): container finished" podID="4fce4bf0-7ba2-414b-9eb6-b285923c740a" containerID="2d3b7bb9d9d5723e530ee66f98c94e5c32c46098021d8a9014e6675556a15deb" exitCode=143 Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.593576 4763 generic.go:334] "Generic (PLEG): container finished" podID="4fce4bf0-7ba2-414b-9eb6-b285923c740a" containerID="11e4ee17318053ee5ba015cabc6d5a0147fb781b3c1dd79779d791fcec0064bb" exitCode=143 Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.593630 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fce4bf0-7ba2-414b-9eb6-b285923c740a","Type":"ContainerDied","Data":"2d3b7bb9d9d5723e530ee66f98c94e5c32c46098021d8a9014e6675556a15deb"} Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.593660 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fce4bf0-7ba2-414b-9eb6-b285923c740a","Type":"ContainerDied","Data":"11e4ee17318053ee5ba015cabc6d5a0147fb781b3c1dd79779d791fcec0064bb"} Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.595488 4763 generic.go:334] "Generic (PLEG): container finished" 
podID="05246186-dc4d-4e95-939f-1b49da0c540c" containerID="158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32" exitCode=143 Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.595509 4763 generic.go:334] "Generic (PLEG): container finished" podID="05246186-dc4d-4e95-939f-1b49da0c540c" containerID="61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95" exitCode=143 Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.595539 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"05246186-dc4d-4e95-939f-1b49da0c540c","Type":"ContainerDied","Data":"158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32"} Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.595554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"05246186-dc4d-4e95-939f-1b49da0c540c","Type":"ContainerDied","Data":"61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95"} Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.595564 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"05246186-dc4d-4e95-939f-1b49da0c540c","Type":"ContainerDied","Data":"4ff7272b4d10fd23864b6a8420ee41d0f0b0b134a73e64c67482003f638cba58"} Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.595578 4763 scope.go:117] "RemoveContainer" containerID="158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.595705 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.598870 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.605846 4763 generic.go:334] "Generic (PLEG): container finished" podID="35d88ba9-06e8-4265-b713-b65722a14944" containerID="149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1" exitCode=0 Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.605874 4763 generic.go:334] "Generic (PLEG): container finished" podID="35d88ba9-06e8-4265-b713-b65722a14944" containerID="4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5" exitCode=2 Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.605887 4763 generic.go:334] "Generic (PLEG): container finished" podID="35d88ba9-06e8-4265-b713-b65722a14944" containerID="d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47" exitCode=0 Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.605897 4763 generic.go:334] "Generic (PLEG): container finished" podID="35d88ba9-06e8-4265-b713-b65722a14944" containerID="5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc" exitCode=0 Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.605899 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35d88ba9-06e8-4265-b713-b65722a14944","Type":"ContainerDied","Data":"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1"} Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.605958 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35d88ba9-06e8-4265-b713-b65722a14944","Type":"ContainerDied","Data":"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5"} Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.605972 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35d88ba9-06e8-4265-b713-b65722a14944","Type":"ContainerDied","Data":"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47"} Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.605981 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35d88ba9-06e8-4265-b713-b65722a14944","Type":"ContainerDied","Data":"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc"} Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.612004 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.613132 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.650553 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbh96\" (UniqueName: \"kubernetes.io/projected/35d88ba9-06e8-4265-b713-b65722a14944-kube-api-access-fbh96\") pod \"35d88ba9-06e8-4265-b713-b65722a14944\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.650638 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-sg-core-conf-yaml\") pod \"35d88ba9-06e8-4265-b713-b65722a14944\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.650801 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-log-httpd\") pod \"35d88ba9-06e8-4265-b713-b65722a14944\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.650864 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-combined-ca-bundle\") pod \"35d88ba9-06e8-4265-b713-b65722a14944\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.650920 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-scripts\") pod \"35d88ba9-06e8-4265-b713-b65722a14944\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.650996 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-run-httpd\") pod \"35d88ba9-06e8-4265-b713-b65722a14944\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.651023 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-config-data\") pod \"35d88ba9-06e8-4265-b713-b65722a14944\" (UID: \"35d88ba9-06e8-4265-b713-b65722a14944\") " Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.651682 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "35d88ba9-06e8-4265-b713-b65722a14944" (UID: "35d88ba9-06e8-4265-b713-b65722a14944"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.651810 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "35d88ba9-06e8-4265-b713-b65722a14944" (UID: "35d88ba9-06e8-4265-b713-b65722a14944"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.657922 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.657950 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35d88ba9-06e8-4265-b713-b65722a14944-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.657965 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.659735 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-scripts" (OuterVolumeSpecName: "scripts") pod "35d88ba9-06e8-4265-b713-b65722a14944" (UID: "35d88ba9-06e8-4265-b713-b65722a14944"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.669578 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d88ba9-06e8-4265-b713-b65722a14944-kube-api-access-fbh96" (OuterVolumeSpecName: "kube-api-access-fbh96") pod "35d88ba9-06e8-4265-b713-b65722a14944" (UID: "35d88ba9-06e8-4265-b713-b65722a14944"). InnerVolumeSpecName "kube-api-access-fbh96". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.730487 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "35d88ba9-06e8-4265-b713-b65722a14944" (UID: "35d88ba9-06e8-4265-b713-b65722a14944"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.759845 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.759874 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbh96\" (UniqueName: \"kubernetes.io/projected/35d88ba9-06e8-4265-b713-b65722a14944-kube-api-access-fbh96\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.759887 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.776133 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35d88ba9-06e8-4265-b713-b65722a14944" (UID: "35d88ba9-06e8-4265-b713-b65722a14944"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.795410 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-config-data" (OuterVolumeSpecName: "config-data") pod "35d88ba9-06e8-4265-b713-b65722a14944" (UID: "35d88ba9-06e8-4265-b713-b65722a14944"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.862182 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.862219 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d88ba9-06e8-4265-b713-b65722a14944-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.881107 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.881830 4763 scope.go:117] "RemoveContainer" containerID="61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.905673 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.921528 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:13 crc kubenswrapper[4763]: E0930 13:55:13.921992 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="proxy-httpd" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922015 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="proxy-httpd" Sep 30 13:55:13 crc kubenswrapper[4763]: E0930 13:55:13.922027 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05246186-dc4d-4e95-939f-1b49da0c540c" containerName="glance-log" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922032 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="05246186-dc4d-4e95-939f-1b49da0c540c" containerName="glance-log" Sep 30 13:55:13 crc kubenswrapper[4763]: E0930 13:55:13.922050 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b479d2b-2298-4c68-b5ea-d95813621a27" containerName="init" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922056 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b479d2b-2298-4c68-b5ea-d95813621a27" containerName="init" Sep 30 13:55:13 crc kubenswrapper[4763]: E0930 13:55:13.922066 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05246186-dc4d-4e95-939f-1b49da0c540c" containerName="glance-httpd" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922072 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="05246186-dc4d-4e95-939f-1b49da0c540c" containerName="glance-httpd" Sep 30 13:55:13 crc kubenswrapper[4763]: E0930 13:55:13.922089 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b479d2b-2298-4c68-b5ea-d95813621a27" containerName="dnsmasq-dns" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922094 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b479d2b-2298-4c68-b5ea-d95813621a27" containerName="dnsmasq-dns" Sep 30 13:55:13 crc kubenswrapper[4763]: E0930 13:55:13.922112 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="ceilometer-central-agent" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922118 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="ceilometer-central-agent" Sep 30 13:55:13 crc kubenswrapper[4763]: E0930 13:55:13.922131 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="ceilometer-notification-agent" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922137 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="ceilometer-notification-agent" Sep 30 13:55:13 crc kubenswrapper[4763]: E0930 13:55:13.922149 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="sg-core" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922155 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="sg-core" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922318 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="05246186-dc4d-4e95-939f-1b49da0c540c" containerName="glance-log" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922332 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="ceilometer-central-agent" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922343 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="05246186-dc4d-4e95-939f-1b49da0c540c" containerName="glance-httpd" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922351 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="proxy-httpd" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922365 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="ceilometer-notification-agent" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922380 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b479d2b-2298-4c68-b5ea-d95813621a27" containerName="dnsmasq-dns" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.922390 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d88ba9-06e8-4265-b713-b65722a14944" containerName="sg-core" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.923411 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.925542 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.926371 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.939657 4763 scope.go:117] "RemoveContainer" containerID="158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.940870 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:13 crc kubenswrapper[4763]: E0930 13:55:13.943154 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32\": container with ID starting with 158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32 not found: ID does not exist" containerID="158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.943251 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32"} err="failed to get container status \"158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32\": rpc error: code = NotFound desc = could not find container \"158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32\": container with ID starting with 158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32 not found: ID does not exist" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.943300 4763 scope.go:117] "RemoveContainer" containerID="61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95" Sep 30 13:55:13 crc kubenswrapper[4763]: E0930 13:55:13.943861 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95\": container with ID starting with 61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95 not found: ID does not exist" containerID="61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.943892 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95"} err="failed to get container status \"61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95\": rpc error: code = NotFound desc = could not find container \"61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95\": container with ID starting with 61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95 not found: ID does not exist" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.943908 4763 scope.go:117] "RemoveContainer" containerID="158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.954161 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32"} err="failed to get container status 
\"158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32\": rpc error: code = NotFound desc = could not find container \"158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32\": container with ID starting with 158884e5ce6cf2194e4fac541f65bad18cdb59e35bc17336210821e786768e32 not found: ID does not exist" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.954205 4763 scope.go:117] "RemoveContainer" containerID="61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.954699 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95"} err="failed to get container status \"61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95\": rpc error: code = NotFound desc = could not find container \"61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95\": container with ID starting with 61f0165124c3a7a780a5ff7ff9c4a41af70ea44ac14780e4b1205361ec6a4f95 not found: ID does not exist" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.954745 4763 scope.go:117] "RemoveContainer" containerID="149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1" Sep 30 13:55:13 crc kubenswrapper[4763]: I0930 13:55:13.973372 4763 scope.go:117] "RemoveContainer" containerID="4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.006788 4763 scope.go:117] "RemoveContainer" containerID="d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.024332 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.053232 4763 scope.go:117] "RemoveContainer" containerID="5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.066091 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.066169 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.066202 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.066258 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.066316 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.066440 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnw98\" (UniqueName: \"kubernetes.io/projected/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-kube-api-access-gnw98\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.067234 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.067300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.080943 4763 scope.go:117] "RemoveContainer" containerID="149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1" Sep 30 13:55:14 crc kubenswrapper[4763]: E0930 13:55:14.081427 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1\": container with ID starting with 149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1 not found: ID does not exist" containerID="149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.081463 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1"} err="failed to get container status \"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1\": rpc error: code = NotFound desc = could not find container \"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1\": container with ID starting with 149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1 not found: ID does not exist" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.081488 4763 scope.go:117] "RemoveContainer" containerID="4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5" Sep 30 13:55:14 crc kubenswrapper[4763]: E0930 13:55:14.081884 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5\": container with ID starting with 4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5 not found: ID does not exist" 
containerID="4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.081938 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5"} err="failed to get container status \"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5\": rpc error: code = NotFound desc = could not find container \"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5\": container with ID starting with 4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5 not found: ID does not exist" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.081967 4763 scope.go:117] "RemoveContainer" containerID="d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47" Sep 30 13:55:14 crc kubenswrapper[4763]: E0930 13:55:14.082245 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47\": container with ID starting with d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47 not found: ID does not exist" containerID="d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.082281 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47"} err="failed to get container status \"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47\": rpc error: code = NotFound desc = could not find container \"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47\": container with ID starting with d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47 not found: ID does not exist" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.082304 4763 scope.go:117] "RemoveContainer" containerID="5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc" Sep 30 13:55:14 crc kubenswrapper[4763]: E0930 13:55:14.082547 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc\": container with ID starting with 5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc not found: ID does not exist" containerID="5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.082588 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc"} err="failed to get container status \"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc\": rpc error: code = NotFound desc = could not find container \"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc\": container with ID starting with 5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc not found: ID does not exist" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.082637 4763 scope.go:117] "RemoveContainer" containerID="149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.083152 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1"} err="failed to get container status \"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1\": rpc error: code = NotFound desc = could not find container \"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1\": container with ID starting with 149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1 not found: ID does not exist" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.083179 4763 scope.go:117] "RemoveContainer" containerID="4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.083396 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5"} err="failed to get container status \"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5\": rpc error: code = NotFound desc = could not find container \"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5\": container with ID starting with 4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5 not found: ID does not exist" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.083416 4763 scope.go:117] "RemoveContainer" containerID="d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.083664 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47"} err="failed to get container status \"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47\": rpc error: code = NotFound desc = could not find container \"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47\": container with ID starting with d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47 not found: ID does not exist" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.083678 4763 scope.go:117] "RemoveContainer" containerID="5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.084074 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc"} err="failed to get container status \"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc\": rpc error: code = NotFound desc = could not find container \"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc\": container with ID starting with 5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc not found: ID does not exist" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.084118 4763 scope.go:117] "RemoveContainer" containerID="149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.084484 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1"} err="failed to get container status \"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1\": rpc error: code = NotFound desc = could not find container \"149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1\": container with ID starting with 149cc41880c25eaa1676ec9fe6e32d7a997808a05c118a3dea04b5dd340526d1 not found: ID does not exist" Sep 
30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.084505 4763 scope.go:117] "RemoveContainer" containerID="4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.084844 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5"} err="failed to get container status \"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5\": rpc error: code = NotFound desc = could not find container \"4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5\": container with ID starting with 4f681075cad7ff68619b3ac42ff96ff18c8522e4fedbefc64ea4d87e8d7928a5 not found: ID does not exist" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.084896 4763 scope.go:117] "RemoveContainer" containerID="d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.085198 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47"} err="failed to get container status \"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47\": rpc error: code = NotFound desc = could not find container \"d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47\": container with ID starting with d03c0db9f314802ddd0b971107f35e6b11b9e3534f36151cabb7a577b4a03a47 not found: ID does not exist" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.085220 4763 scope.go:117] "RemoveContainer" containerID="5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.085443 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc"} err="failed to get container status \"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc\": rpc error: code = NotFound desc = could not find container \"5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc\": container with ID starting with 5f6fba88083f39dfebbc3fd396a75ddf0c8feeecad796a17af9a021693e852fc not found: ID does not exist" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.168677 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-scripts\") pod \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.168815 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-config-data\") pod \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.168863 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-combined-ca-bundle\") pod \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.169561 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.169656 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr4f6\" (UniqueName: \"kubernetes.io/projected/4fce4bf0-7ba2-414b-9eb6-b285923c740a-kube-api-access-tr4f6\") pod \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.169704 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-logs\") pod \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.170277 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-logs" (OuterVolumeSpecName: "logs") pod "4fce4bf0-7ba2-414b-9eb6-b285923c740a" (UID: "4fce4bf0-7ba2-414b-9eb6-b285923c740a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.170348 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-httpd-run\") pod \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\" (UID: \"4fce4bf0-7ba2-414b-9eb6-b285923c740a\") " Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.170591 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4fce4bf0-7ba2-414b-9eb6-b285923c740a" (UID: "4fce4bf0-7ba2-414b-9eb6-b285923c740a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.170895 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.173485 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnw98\" (UniqueName: \"kubernetes.io/projected/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-kube-api-access-gnw98\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.173546 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.173590 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.173793 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.173870 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.173890 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.173958 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.174088 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.174109 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/4fce4bf0-7ba2-414b-9eb6-b285923c740a-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.173951 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fce4bf0-7ba2-414b-9eb6-b285923c740a-kube-api-access-tr4f6" (OuterVolumeSpecName: "kube-api-access-tr4f6") pod "4fce4bf0-7ba2-414b-9eb6-b285923c740a" (UID: "4fce4bf0-7ba2-414b-9eb6-b285923c740a"). InnerVolumeSpecName "kube-api-access-tr4f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.174317 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.174425 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.174812 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.179962 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "4fce4bf0-7ba2-414b-9eb6-b285923c740a" (UID: "4fce4bf0-7ba2-414b-9eb6-b285923c740a"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.180073 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-scripts" (OuterVolumeSpecName: "scripts") pod "4fce4bf0-7ba2-414b-9eb6-b285923c740a" (UID: "4fce4bf0-7ba2-414b-9eb6-b285923c740a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.180297 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.180351 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.180736 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.183208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.188335 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnw98\" (UniqueName: \"kubernetes.io/projected/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-kube-api-access-gnw98\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.211195 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fce4bf0-7ba2-414b-9eb6-b285923c740a" (UID: "4fce4bf0-7ba2-414b-9eb6-b285923c740a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.211615 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.237557 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-config-data" (OuterVolumeSpecName: "config-data") pod "4fce4bf0-7ba2-414b-9eb6-b285923c740a" (UID: "4fce4bf0-7ba2-414b-9eb6-b285923c740a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.247223 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.276354 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.276392 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.276437 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.276451 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr4f6\" (UniqueName: \"kubernetes.io/projected/4fce4bf0-7ba2-414b-9eb6-b285923c740a-kube-api-access-tr4f6\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.276464 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fce4bf0-7ba2-414b-9eb6-b285923c740a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.327344 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.378492 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.511439 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05246186-dc4d-4e95-939f-1b49da0c540c" path="/var/lib/kubelet/pods/05246186-dc4d-4e95-939f-1b49da0c540c/volumes" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.620892 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fce4bf0-7ba2-414b-9eb6-b285923c740a","Type":"ContainerDied","Data":"f3793df3c53012c94038d3bf19ebe3051fdc7f9c106c0700a747f9e533aa2a03"} Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.620945 4763 scope.go:117] "RemoveContainer" containerID="2d3b7bb9d9d5723e530ee66f98c94e5c32c46098021d8a9014e6675556a15deb" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.621041 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.629368 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.630028 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35d88ba9-06e8-4265-b713-b65722a14944","Type":"ContainerDied","Data":"dfbfebca04c594748fd98c074159e7c0e47ad75d3bf1972d99aea00fa14cfcee"} Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.711899 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.725045 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.735681 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.745412 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.749428 4763 scope.go:117] "RemoveContainer" containerID="11e4ee17318053ee5ba015cabc6d5a0147fb781b3c1dd79779d791fcec0064bb" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.754438 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:14 crc kubenswrapper[4763]: E0930 13:55:14.755497 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fce4bf0-7ba2-414b-9eb6-b285923c740a" containerName="glance-log" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.755516 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fce4bf0-7ba2-414b-9eb6-b285923c740a" containerName="glance-log" Sep 30 13:55:14 crc kubenswrapper[4763]: E0930 13:55:14.755531 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fce4bf0-7ba2-414b-9eb6-b285923c740a" containerName="glance-httpd" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.755537 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fce4bf0-7ba2-414b-9eb6-b285923c740a" containerName="glance-httpd" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.756856 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fce4bf0-7ba2-414b-9eb6-b285923c740a" containerName="glance-log" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.756926 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fce4bf0-7ba2-414b-9eb6-b285923c740a" containerName="glance-httpd" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.759782 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.767142 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.767169 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.767339 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.772056 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.774673 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.781292 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.781568 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.784505 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.892448 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-logs\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.892752 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxv62\" (UniqueName: \"kubernetes.io/projected/f9249dc6-ff96-4198-b25e-7362067617ab-kube-api-access-vxv62\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.892837 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.893021 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-scripts\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.893085 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-config-data\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.893252 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-run-httpd\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.893314 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.893335 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvl9z\" (UniqueName: 
\"kubernetes.io/projected/3acbf6d3-0af6-49df-9884-7a79660c0d38-kube-api-access-fvl9z\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.893356 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-log-httpd\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.894095 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.895257 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.895296 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-scripts\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.895342 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.895358 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-config-data\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.895432 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.996763 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-logs\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.996834 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxv62\" (UniqueName: \"kubernetes.io/projected/f9249dc6-ff96-4198-b25e-7362067617ab-kube-api-access-vxv62\") pod \"glance-default-external-api-0\" (UID: 
\"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.996873 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.996914 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-scripts\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.996936 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-config-data\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.996989 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-run-httpd\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997012 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997036 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvl9z\" (UniqueName: \"kubernetes.io/projected/3acbf6d3-0af6-49df-9884-7a79660c0d38-kube-api-access-fvl9z\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-log-httpd\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997093 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997162 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997184 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-scripts\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997207 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997227 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-config-data\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997269 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997272 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-logs\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997586 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-log-httpd\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997635 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-run-httpd\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.997915 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:14 crc kubenswrapper[4763]: I0930 13:55:14.999374 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.003819 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.004804 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.005080 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-scripts\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.005380 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.008223 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-scripts\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.010098 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.011315 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-config-data\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.014691 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-config-data\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.019207 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxv62\" (UniqueName: \"kubernetes.io/projected/f9249dc6-ff96-4198-b25e-7362067617ab-kube-api-access-vxv62\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.021474 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.030119 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvl9z\" (UniqueName: \"kubernetes.io/projected/3acbf6d3-0af6-49df-9884-7a79660c0d38-kube-api-access-fvl9z\") pod \"ceilometer-0\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") " pod="openstack/ceilometer-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.037820 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:15 crc kubenswrapper[4763]: W0930 13:55:15.041197 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ce2bb5a_59f2_44ca_92ef_4b98681acdc6.slice/crio-0bb68e228b6fe7d933a8f605d2eb3f42b9e3c06c8345878b91c9779ed1a22422 WatchSource:0}: Error finding container 0bb68e228b6fe7d933a8f605d2eb3f42b9e3c06c8345878b91c9779ed1a22422: Status 404 returned error can't find the container with id 0bb68e228b6fe7d933a8f605d2eb3f42b9e3c06c8345878b91c9779ed1a22422 Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.095635 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.117069 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.642461 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6","Type":"ContainerStarted","Data":"0bb68e228b6fe7d933a8f605d2eb3f42b9e3c06c8345878b91c9779ed1a22422"} Sep 30 13:55:15 crc kubenswrapper[4763]: I0930 13:55:15.918460 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.032785 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.502854 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35d88ba9-06e8-4265-b713-b65722a14944" path="/var/lib/kubelet/pods/35d88ba9-06e8-4265-b713-b65722a14944/volumes" Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.504106 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fce4bf0-7ba2-414b-9eb6-b285923c740a" path="/var/lib/kubelet/pods/4fce4bf0-7ba2-414b-9eb6-b285923c740a/volumes" Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.659631 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3acbf6d3-0af6-49df-9884-7a79660c0d38","Type":"ContainerStarted","Data":"81e6cb704768bfad243ca58e76052eea4884346782b0bae2d0c113da32235d49"} Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.663936 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6","Type":"ContainerStarted","Data":"a61f9690c80b1c238dbd56d88a7ef04eb303c9d58661a1e01a731d1e3c8f2914"} Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.663986 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6","Type":"ContainerStarted","Data":"9c8915719c91e41929906d9743d00233d0186243330a277c24adefac9170aee0"} Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.664840 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f9249dc6-ff96-4198-b25e-7362067617ab","Type":"ContainerStarted","Data":"e584877617c206f005dc1e88ac507f72d7994b10804d6e1690868effe46703be"} Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.818769 4763 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.880479 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-h46w6"] Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.880902 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" podUID="6386172b-d1f8-4dd4-897e-cd58a6acf678" containerName="dnsmasq-dns" containerID="cri-o://8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927" gracePeriod=10 Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.931836 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:16 crc kubenswrapper[4763]: I0930 13:55:16.999222 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.574071 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.657197 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt5s7\" (UniqueName: \"kubernetes.io/projected/6386172b-d1f8-4dd4-897e-cd58a6acf678-kube-api-access-mt5s7\") pod \"6386172b-d1f8-4dd4-897e-cd58a6acf678\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.657246 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-config\") pod \"6386172b-d1f8-4dd4-897e-cd58a6acf678\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.657284 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-swift-storage-0\") pod \"6386172b-d1f8-4dd4-897e-cd58a6acf678\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.657313 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-svc\") pod \"6386172b-d1f8-4dd4-897e-cd58a6acf678\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.657335 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-nb\") pod \"6386172b-d1f8-4dd4-897e-cd58a6acf678\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.657517 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-sb\") pod \"6386172b-d1f8-4dd4-897e-cd58a6acf678\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.666821 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6386172b-d1f8-4dd4-897e-cd58a6acf678-kube-api-access-mt5s7" (OuterVolumeSpecName: 
"kube-api-access-mt5s7") pod "6386172b-d1f8-4dd4-897e-cd58a6acf678" (UID: "6386172b-d1f8-4dd4-897e-cd58a6acf678"). InnerVolumeSpecName "kube-api-access-mt5s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.708471 4763 generic.go:334] "Generic (PLEG): container finished" podID="6386172b-d1f8-4dd4-897e-cd58a6acf678" containerID="8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927" exitCode=0 Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.708552 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" event={"ID":"6386172b-d1f8-4dd4-897e-cd58a6acf678","Type":"ContainerDied","Data":"8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927"} Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.708583 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" event={"ID":"6386172b-d1f8-4dd4-897e-cd58a6acf678","Type":"ContainerDied","Data":"d7b88f1a9ebf6a7a57db309be0ed500a8edc9cfe637265e03d060ad66e389c0c"} Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.708619 4763 scope.go:117] "RemoveContainer" containerID="8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.708790 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4dc449d9-h46w6" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.723460 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f9249dc6-ff96-4198-b25e-7362067617ab","Type":"ContainerStarted","Data":"4d8542f41273e622d163dbe4ad7305ecf9a69c0a41ec43ac9d60f5cdcb8a0328"} Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.768581 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt5s7\" (UniqueName: \"kubernetes.io/projected/6386172b-d1f8-4dd4-897e-cd58a6acf678-kube-api-access-mt5s7\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.775555 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.775531644 podStartE2EDuration="4.775531644s" podCreationTimestamp="2025-09-30 13:55:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:17.754904206 +0000 UTC m=+1189.893464491" watchObservedRunningTime="2025-09-30 13:55:17.775531644 +0000 UTC m=+1189.914091929" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.794930 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6386172b-d1f8-4dd4-897e-cd58a6acf678" (UID: "6386172b-d1f8-4dd4-897e-cd58a6acf678"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.794907 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6386172b-d1f8-4dd4-897e-cd58a6acf678" (UID: "6386172b-d1f8-4dd4-897e-cd58a6acf678"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.836083 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6386172b-d1f8-4dd4-897e-cd58a6acf678" (UID: "6386172b-d1f8-4dd4-897e-cd58a6acf678"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:17 crc kubenswrapper[4763]: E0930 13:55:17.855019 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-config podName:6386172b-d1f8-4dd4-897e-cd58a6acf678 nodeName:}" failed. No retries permitted until 2025-09-30 13:55:18.354992557 +0000 UTC m=+1190.493552842 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-config") pod "6386172b-d1f8-4dd4-897e-cd58a6acf678" (UID: "6386172b-d1f8-4dd4-897e-cd58a6acf678") : error deleting /var/lib/kubelet/pods/6386172b-d1f8-4dd4-897e-cd58a6acf678/volume-subpaths: remove /var/lib/kubelet/pods/6386172b-d1f8-4dd4-897e-cd58a6acf678/volume-subpaths: no such file or directory Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.855308 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6386172b-d1f8-4dd4-897e-cd58a6acf678" (UID: "6386172b-d1f8-4dd4-897e-cd58a6acf678"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.870721 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.870757 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.870770 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:17 crc kubenswrapper[4763]: I0930 13:55:17.870782 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:18 crc kubenswrapper[4763]: I0930 13:55:18.246042 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:18 crc kubenswrapper[4763]: I0930 13:55:18.246988 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:55:18 crc kubenswrapper[4763]: I0930 13:55:18.379304 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-config\") pod \"6386172b-d1f8-4dd4-897e-cd58a6acf678\" (UID: \"6386172b-d1f8-4dd4-897e-cd58a6acf678\") " Sep 30 13:55:18 crc 
kubenswrapper[4763]: I0930 13:55:18.380992 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-config" (OuterVolumeSpecName: "config") pod "6386172b-d1f8-4dd4-897e-cd58a6acf678" (UID: "6386172b-d1f8-4dd4-897e-cd58a6acf678"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:18 crc kubenswrapper[4763]: I0930 13:55:18.481463 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6386172b-d1f8-4dd4-897e-cd58a6acf678-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:18 crc kubenswrapper[4763]: I0930 13:55:18.640168 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-h46w6"] Sep 30 13:55:18 crc kubenswrapper[4763]: I0930 13:55:18.651235 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-h46w6"] Sep 30 13:55:18 crc kubenswrapper[4763]: I0930 13:55:18.734300 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f9249dc6-ff96-4198-b25e-7362067617ab","Type":"ContainerStarted","Data":"0a9f5f188ca7ae4e254395e48120dcd08ab2e6b2c8ba46ff6fd04b550ba23940"} Sep 30 13:55:18 crc kubenswrapper[4763]: I0930 13:55:18.761771 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.761754794 podStartE2EDuration="4.761754794s" podCreationTimestamp="2025-09-30 13:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:18.755726633 +0000 UTC m=+1190.894286918" watchObservedRunningTime="2025-09-30 13:55:18.761754794 +0000 UTC m=+1190.900315079" Sep 30 13:55:19 crc kubenswrapper[4763]: I0930 13:55:19.743822 4763 generic.go:334] "Generic (PLEG): container finished" podID="96adbfe1-e6f8-4460-b999-a213cb396c4b" containerID="0bd45db8372dd13c7f8b6af5a8f009d51c1365ea9b4c411b536ad6bd5ed5a9de" exitCode=0 Sep 30 13:55:19 crc kubenswrapper[4763]: I0930 13:55:19.745194 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xmvws" event={"ID":"96adbfe1-e6f8-4460-b999-a213cb396c4b","Type":"ContainerDied","Data":"0bd45db8372dd13c7f8b6af5a8f009d51c1365ea9b4c411b536ad6bd5ed5a9de"} Sep 30 13:55:20 crc kubenswrapper[4763]: I0930 13:55:20.229440 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:20 crc kubenswrapper[4763]: I0930 13:55:20.255038 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:55:20 crc kubenswrapper[4763]: I0930 13:55:20.334002 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-689f4d67f6-55mbs"] Sep 30 13:55:20 crc kubenswrapper[4763]: I0930 13:55:20.334514 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-689f4d67f6-55mbs" podUID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" containerName="barbican-api-log" containerID="cri-o://cea0a32b6472924f81a8f5a6c9c76fc4d5b769cea550231a768186dd60893a0d" gracePeriod=30 Sep 30 13:55:20 crc kubenswrapper[4763]: I0930 13:55:20.335082 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-689f4d67f6-55mbs" podUID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" 
containerName="barbican-api" containerID="cri-o://759be06ade67c37c53371021ab1577180afe88e157c3115d7ac4ebac302afade" gracePeriod=30 Sep 30 13:55:20 crc kubenswrapper[4763]: I0930 13:55:20.510032 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6386172b-d1f8-4dd4-897e-cd58a6acf678" path="/var/lib/kubelet/pods/6386172b-d1f8-4dd4-897e-cd58a6acf678/volumes" Sep 30 13:55:20 crc kubenswrapper[4763]: I0930 13:55:20.756652 4763 generic.go:334] "Generic (PLEG): container finished" podID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" containerID="cea0a32b6472924f81a8f5a6c9c76fc4d5b769cea550231a768186dd60893a0d" exitCode=143 Sep 30 13:55:20 crc kubenswrapper[4763]: I0930 13:55:20.757476 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-689f4d67f6-55mbs" event={"ID":"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9","Type":"ContainerDied","Data":"cea0a32b6472924f81a8f5a6c9c76fc4d5b769cea550231a768186dd60893a0d"} Sep 30 13:55:21 crc kubenswrapper[4763]: I0930 13:55:21.771206 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:21 crc kubenswrapper[4763]: I0930 13:55:21.774243 4763 generic.go:334] "Generic (PLEG): container finished" podID="e000c274-a7a0-493f-a0ea-537e5c474cb0" containerID="1e418fd879a76bc974ec3882d16798e171a7acc9a5c7ba9107b332d8d6aea0fc" exitCode=0 Sep 30 13:55:21 crc kubenswrapper[4763]: I0930 13:55:21.774276 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjgmb" event={"ID":"e000c274-a7a0-493f-a0ea-537e5c474cb0","Type":"ContainerDied","Data":"1e418fd879a76bc974ec3882d16798e171a7acc9a5c7ba9107b332d8d6aea0fc"} Sep 30 13:55:23 crc kubenswrapper[4763]: I0930 13:55:23.792518 4763 generic.go:334] "Generic (PLEG): container finished" podID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" containerID="759be06ade67c37c53371021ab1577180afe88e157c3115d7ac4ebac302afade" exitCode=0 Sep 30 13:55:23 crc kubenswrapper[4763]: I0930 13:55:23.792700 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-689f4d67f6-55mbs" event={"ID":"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9","Type":"ContainerDied","Data":"759be06ade67c37c53371021ab1577180afe88e157c3115d7ac4ebac302afade"} Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.248051 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.248107 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.290003 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.291719 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.592730 4763 scope.go:117] "RemoveContainer" containerID="e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.776250 4763 scope.go:117] "RemoveContainer" containerID="8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927" Sep 30 13:55:24 crc kubenswrapper[4763]: E0930 13:55:24.776738 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927\": container with ID starting with 8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927 not found: ID does not exist" containerID="8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.776776 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927"} err="failed to get container status \"8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927\": rpc error: code = NotFound desc = could not find container \"8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927\": container with ID starting with 8bd0f224d72af117d1886e4bfdf0f506ca5ef38ac3004afdf43142639207d927 not found: ID does not exist" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.776803 4763 scope.go:117] "RemoveContainer" containerID="e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815" Sep 30 13:55:24 crc kubenswrapper[4763]: E0930 13:55:24.777169 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815\": container with ID starting with e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815 not found: ID does not exist" containerID="e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.777194 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815"} err="failed to get container status \"e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815\": rpc error: code = NotFound desc = could not find container \"e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815\": container with ID starting with e94d1c490070ecaf5fc648d977268a66e266031b9689e3828806710169bd9815 not found: ID does not exist" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.808585 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xmvws" event={"ID":"96adbfe1-e6f8-4460-b999-a213cb396c4b","Type":"ContainerDied","Data":"78b6941f5e8bb3b06e8fbfcbabf262a7ed631124053f1081a46c4b56fe51e8a4"} Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.808712 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78b6941f5e8bb3b06e8fbfcbabf262a7ed631124053f1081a46c4b56fe51e8a4" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.812552 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjgmb" event={"ID":"e000c274-a7a0-493f-a0ea-537e5c474cb0","Type":"ContainerDied","Data":"f2e3d805f91ff3eb424ecae60705132d2fb21a2b0a675c59cf1cb2ef1316f545"} Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.812583 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2e3d805f91ff3eb424ecae60705132d2fb21a2b0a675c59cf1cb2ef1316f545" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.812617 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.813177 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 
13:55:24.944021 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xmvws" Sep 30 13:55:24 crc kubenswrapper[4763]: I0930 13:55:24.989142 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zjgmb" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.007836 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.095573 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv2r6\" (UniqueName: \"kubernetes.io/projected/e000c274-a7a0-493f-a0ea-537e5c474cb0-kube-api-access-hv2r6\") pod \"e000c274-a7a0-493f-a0ea-537e5c474cb0\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.095633 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-config\") pod \"96adbfe1-e6f8-4460-b999-a213cb396c4b\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.095676 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-scripts\") pod \"e000c274-a7a0-493f-a0ea-537e5c474cb0\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.095722 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-combined-ca-bundle\") pod \"96adbfe1-e6f8-4460-b999-a213cb396c4b\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.095750 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-combined-ca-bundle\") pod \"e000c274-a7a0-493f-a0ea-537e5c474cb0\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.095786 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e000c274-a7a0-493f-a0ea-537e5c474cb0-etc-machine-id\") pod \"e000c274-a7a0-493f-a0ea-537e5c474cb0\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.095817 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgmxt\" (UniqueName: \"kubernetes.io/projected/96adbfe1-e6f8-4460-b999-a213cb396c4b-kube-api-access-jgmxt\") pod \"96adbfe1-e6f8-4460-b999-a213cb396c4b\" (UID: \"96adbfe1-e6f8-4460-b999-a213cb396c4b\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.095888 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-config-data\") pod \"e000c274-a7a0-493f-a0ea-537e5c474cb0\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.095923 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-db-sync-config-data\") pod \"e000c274-a7a0-493f-a0ea-537e5c474cb0\" (UID: \"e000c274-a7a0-493f-a0ea-537e5c474cb0\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.096492 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e000c274-a7a0-493f-a0ea-537e5c474cb0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e000c274-a7a0-493f-a0ea-537e5c474cb0" (UID: "e000c274-a7a0-493f-a0ea-537e5c474cb0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.099256 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.099300 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.104631 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-scripts" (OuterVolumeSpecName: "scripts") pod "e000c274-a7a0-493f-a0ea-537e5c474cb0" (UID: "e000c274-a7a0-493f-a0ea-537e5c474cb0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.112928 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96adbfe1-e6f8-4460-b999-a213cb396c4b-kube-api-access-jgmxt" (OuterVolumeSpecName: "kube-api-access-jgmxt") pod "96adbfe1-e6f8-4460-b999-a213cb396c4b" (UID: "96adbfe1-e6f8-4460-b999-a213cb396c4b"). InnerVolumeSpecName "kube-api-access-jgmxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.113680 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e000c274-a7a0-493f-a0ea-537e5c474cb0" (UID: "e000c274-a7a0-493f-a0ea-537e5c474cb0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.115472 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e000c274-a7a0-493f-a0ea-537e5c474cb0-kube-api-access-hv2r6" (OuterVolumeSpecName: "kube-api-access-hv2r6") pod "e000c274-a7a0-493f-a0ea-537e5c474cb0" (UID: "e000c274-a7a0-493f-a0ea-537e5c474cb0"). InnerVolumeSpecName "kube-api-access-hv2r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.132725 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e000c274-a7a0-493f-a0ea-537e5c474cb0" (UID: "e000c274-a7a0-493f-a0ea-537e5c474cb0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.136697 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96adbfe1-e6f8-4460-b999-a213cb396c4b" (UID: "96adbfe1-e6f8-4460-b999-a213cb396c4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.140085 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.142774 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-config" (OuterVolumeSpecName: "config") pod "96adbfe1-e6f8-4460-b999-a213cb396c4b" (UID: "96adbfe1-e6f8-4460-b999-a213cb396c4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.151468 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.163119 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-config-data" (OuterVolumeSpecName: "config-data") pod "e000c274-a7a0-493f-a0ea-537e5c474cb0" (UID: "e000c274-a7a0-493f-a0ea-537e5c474cb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.196999 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-logs\") pod \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.197072 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data\") pod \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.197124 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-combined-ca-bundle\") pod \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.197666 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-logs" (OuterVolumeSpecName: "logs") pod "a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" (UID: "a87af91b-c71e-4d5e-a7d2-10fa502a6dc9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.197981 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j47xn\" (UniqueName: \"kubernetes.io/projected/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-kube-api-access-j47xn\") pod \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.198027 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data-custom\") pod \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\" (UID: \"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9\") " Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.198692 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.198710 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.198721 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv2r6\" (UniqueName: \"kubernetes.io/projected/e000c274-a7a0-493f-a0ea-537e5c474cb0-kube-api-access-hv2r6\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.198730 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.198738 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.198747 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96adbfe1-e6f8-4460-b999-a213cb396c4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.199036 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e000c274-a7a0-493f-a0ea-537e5c474cb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.199464 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e000c274-a7a0-493f-a0ea-537e5c474cb0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.199564 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgmxt\" (UniqueName: \"kubernetes.io/projected/96adbfe1-e6f8-4460-b999-a213cb396c4b-kube-api-access-jgmxt\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.199644 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.202300 4763 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" (UID: "a87af91b-c71e-4d5e-a7d2-10fa502a6dc9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.203056 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-kube-api-access-j47xn" (OuterVolumeSpecName: "kube-api-access-j47xn") pod "a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" (UID: "a87af91b-c71e-4d5e-a7d2-10fa502a6dc9"). InnerVolumeSpecName "kube-api-access-j47xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.220385 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" (UID: "a87af91b-c71e-4d5e-a7d2-10fa502a6dc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.249092 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data" (OuterVolumeSpecName: "config-data") pod "a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" (UID: "a87af91b-c71e-4d5e-a7d2-10fa502a6dc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.301675 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j47xn\" (UniqueName: \"kubernetes.io/projected/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-kube-api-access-j47xn\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.301708 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.301719 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.301729 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.623388 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-bj78g"] Sep 30 13:55:25 crc kubenswrapper[4763]: E0930 13:55:25.624038 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6386172b-d1f8-4dd4-897e-cd58a6acf678" containerName="dnsmasq-dns" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.624060 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6386172b-d1f8-4dd4-897e-cd58a6acf678" containerName="dnsmasq-dns" Sep 30 13:55:25 crc kubenswrapper[4763]: E0930 13:55:25.624077 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" containerName="barbican-api-log" Sep 30 13:55:25 crc 
kubenswrapper[4763]: I0930 13:55:25.624084 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" containerName="barbican-api-log" Sep 30 13:55:25 crc kubenswrapper[4763]: E0930 13:55:25.624093 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e000c274-a7a0-493f-a0ea-537e5c474cb0" containerName="cinder-db-sync" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.624101 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e000c274-a7a0-493f-a0ea-537e5c474cb0" containerName="cinder-db-sync" Sep 30 13:55:25 crc kubenswrapper[4763]: E0930 13:55:25.624125 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6386172b-d1f8-4dd4-897e-cd58a6acf678" containerName="init" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.624133 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6386172b-d1f8-4dd4-897e-cd58a6acf678" containerName="init" Sep 30 13:55:25 crc kubenswrapper[4763]: E0930 13:55:25.624157 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" containerName="barbican-api" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.624165 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" containerName="barbican-api" Sep 30 13:55:25 crc kubenswrapper[4763]: E0930 13:55:25.624178 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96adbfe1-e6f8-4460-b999-a213cb396c4b" containerName="neutron-db-sync" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.624186 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="96adbfe1-e6f8-4460-b999-a213cb396c4b" containerName="neutron-db-sync" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.624354 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" containerName="barbican-api-log" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.624371 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6386172b-d1f8-4dd4-897e-cd58a6acf678" containerName="dnsmasq-dns" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.624385 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e000c274-a7a0-493f-a0ea-537e5c474cb0" containerName="cinder-db-sync" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.624398 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" containerName="barbican-api" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.624428 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="96adbfe1-e6f8-4460-b999-a213cb396c4b" containerName="neutron-db-sync" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.625006 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bj78g" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.646081 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bj78g"] Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.689530 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-xxzdg"] Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.690999 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xxzdg" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.702786 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xxzdg"] Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.806101 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-x49w7"] Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.807246 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-x49w7" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.814224 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-x49w7"] Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.815639 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5m5\" (UniqueName: \"kubernetes.io/projected/01d448d7-005e-443a-9931-01565aa7a5f1-kube-api-access-pw5m5\") pod \"nova-api-db-create-bj78g\" (UID: \"01d448d7-005e-443a-9931-01565aa7a5f1\") " pod="openstack/nova-api-db-create-bj78g" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.815778 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q28w9\" (UniqueName: \"kubernetes.io/projected/f1296f29-0be2-4be4-bb9e-3670307d9d05-kube-api-access-q28w9\") pod \"nova-cell0-db-create-xxzdg\" (UID: \"f1296f29-0be2-4be4-bb9e-3670307d9d05\") " pod="openstack/nova-cell0-db-create-xxzdg" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.846524 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-689f4d67f6-55mbs" event={"ID":"a87af91b-c71e-4d5e-a7d2-10fa502a6dc9","Type":"ContainerDied","Data":"933d9faf7546ba20cb10b6610fb80e7771030beb232c6a0b4ef183f67691dcda"} Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.846550 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-689f4d67f6-55mbs" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.846593 4763 scope.go:117] "RemoveContainer" containerID="759be06ade67c37c53371021ab1577180afe88e157c3115d7ac4ebac302afade" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.849815 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3acbf6d3-0af6-49df-9884-7a79660c0d38","Type":"ContainerStarted","Data":"e78060f272ffda4310640310e6334115da0c0d71f1473241a98c76d12624ed0a"} Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.849848 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3acbf6d3-0af6-49df-9884-7a79660c0d38","Type":"ContainerStarted","Data":"b67209a39cc76a2d09a6f6508e0d53b09f7963f481bceb2a396c9759d3019c5a"} Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.851172 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xmvws" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.851500 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"98e98c9d-b727-4c5b-857b-13064b0ef92f","Type":"ContainerStarted","Data":"e4eaf436f4bb9a039d79a107f25f1b38d03cc925e22d803d54fdd98495213540"} Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.851554 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zjgmb" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.853058 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.853075 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.880853 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.035698478 podStartE2EDuration="20.880827859s" podCreationTimestamp="2025-09-30 13:55:05 +0000 UTC" firstStartedPulling="2025-09-30 13:55:06.783672636 +0000 UTC m=+1178.922232921" lastFinishedPulling="2025-09-30 13:55:24.628802017 +0000 UTC m=+1196.767362302" observedRunningTime="2025-09-30 13:55:25.873402592 +0000 UTC m=+1198.011962897" watchObservedRunningTime="2025-09-30 13:55:25.880827859 +0000 UTC m=+1198.019388154" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.882878 4763 scope.go:117] "RemoveContainer" containerID="cea0a32b6472924f81a8f5a6c9c76fc4d5b769cea550231a768186dd60893a0d" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.903289 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-689f4d67f6-55mbs"] Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.912114 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-689f4d67f6-55mbs"] Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.944174 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5m5\" (UniqueName: \"kubernetes.io/projected/01d448d7-005e-443a-9931-01565aa7a5f1-kube-api-access-pw5m5\") pod \"nova-api-db-create-bj78g\" (UID: \"01d448d7-005e-443a-9931-01565aa7a5f1\") " pod="openstack/nova-api-db-create-bj78g" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.944278 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q28w9\" (UniqueName: \"kubernetes.io/projected/f1296f29-0be2-4be4-bb9e-3670307d9d05-kube-api-access-q28w9\") pod \"nova-cell0-db-create-xxzdg\" (UID: \"f1296f29-0be2-4be4-bb9e-3670307d9d05\") " pod="openstack/nova-cell0-db-create-xxzdg" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.944300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w2t4\" (UniqueName: \"kubernetes.io/projected/9793edca-7ca4-4ccc-8448-42b6897bb3b9-kube-api-access-2w2t4\") pod \"nova-cell1-db-create-x49w7\" (UID: \"9793edca-7ca4-4ccc-8448-42b6897bb3b9\") " pod="openstack/nova-cell1-db-create-x49w7" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.965343 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q28w9\" (UniqueName: \"kubernetes.io/projected/f1296f29-0be2-4be4-bb9e-3670307d9d05-kube-api-access-q28w9\") pod \"nova-cell0-db-create-xxzdg\" (UID: \"f1296f29-0be2-4be4-bb9e-3670307d9d05\") " pod="openstack/nova-cell0-db-create-xxzdg" Sep 30 13:55:25 crc kubenswrapper[4763]: I0930 13:55:25.978371 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5m5\" (UniqueName: \"kubernetes.io/projected/01d448d7-005e-443a-9931-01565aa7a5f1-kube-api-access-pw5m5\") pod \"nova-api-db-create-bj78g\" (UID: \"01d448d7-005e-443a-9931-01565aa7a5f1\") " pod="openstack/nova-api-db-create-bj78g" Sep 30 13:55:26 crc 
kubenswrapper[4763]: I0930 13:55:26.012091 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bj78g" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.036576 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xxzdg" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.047083 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w2t4\" (UniqueName: \"kubernetes.io/projected/9793edca-7ca4-4ccc-8448-42b6897bb3b9-kube-api-access-2w2t4\") pod \"nova-cell1-db-create-x49w7\" (UID: \"9793edca-7ca4-4ccc-8448-42b6897bb3b9\") " pod="openstack/nova-cell1-db-create-x49w7" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.063877 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w2t4\" (UniqueName: \"kubernetes.io/projected/9793edca-7ca4-4ccc-8448-42b6897bb3b9-kube-api-access-2w2t4\") pod \"nova-cell1-db-create-x49w7\" (UID: \"9793edca-7ca4-4ccc-8448-42b6897bb3b9\") " pod="openstack/nova-cell1-db-create-x49w7" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.124106 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-x49w7" Sep 30 13:55:26 crc kubenswrapper[4763]: E0930 13:55:26.194392 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96adbfe1_e6f8_4460_b999_a213cb396c4b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode000c274_a7a0_493f_a0ea_537e5c474cb0.slice/crio-f2e3d805f91ff3eb424ecae60705132d2fb21a2b0a675c59cf1cb2ef1316f545\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda87af91b_c71e_4d5e_a7d2_10fa502a6dc9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode000c274_a7a0_493f_a0ea_537e5c474cb0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96adbfe1_e6f8_4460_b999_a213cb396c4b.slice/crio-78b6941f5e8bb3b06e8fbfcbabf262a7ed631124053f1081a46c4b56fe51e8a4\": RecentStats: unable to find data in memory cache]" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.239401 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-548b47b48c-k2hgl"] Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.242610 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.294879 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-548b47b48c-k2hgl"] Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.321743 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-576df9b9d8-5btc5"] Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.323197 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.326544 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.326824 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.330657 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bf5w6" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.330825 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.349771 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.351227 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.357243 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.357522 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nwkzr" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.358518 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.359173 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.365592 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-nb\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.365640 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llzgs\" (UniqueName: \"kubernetes.io/projected/ca9a201f-15c8-42e6-8e65-601382dd2c39-kube-api-access-llzgs\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.365676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-sb\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.365746 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-swift-storage-0\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.365781 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-svc\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.365859 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-config\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.376742 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-576df9b9d8-5btc5"] Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.385679 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.469494 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.469533 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-httpd-config\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.469560 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-swift-storage-0\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.469580 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.469623 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-svc\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.469665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdxpf\" (UniqueName: \"kubernetes.io/projected/4ddf1156-f78b-43ce-bca1-44c026df8262-kube-api-access-hdxpf\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.469694 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.469713 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4ddf1156-f78b-43ce-bca1-44c026df8262-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.470856 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-swift-storage-0\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.470943 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26sn2\" (UniqueName: \"kubernetes.io/projected/fc3e6347-c27f-4249-a1b7-145165c06d70-kube-api-access-26sn2\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.470973 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-config\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.470990 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-config\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.471005 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-ovndb-tls-certs\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.471029 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-nb\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.471058 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llzgs\" (UniqueName: \"kubernetes.io/projected/ca9a201f-15c8-42e6-8e65-601382dd2c39-kube-api-access-llzgs\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.471084 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.471103 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-sb\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.471148 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-combined-ca-bundle\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.471707 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-config\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.471831 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-svc\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.472524 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-nb\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.472746 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-sb\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.482817 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-548b47b48c-k2hgl"] Sep 30 13:55:26 crc kubenswrapper[4763]: E0930 13:55:26.483447 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-llzgs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" podUID="ca9a201f-15c8-42e6-8e65-601382dd2c39" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.512916 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llzgs\" (UniqueName: \"kubernetes.io/projected/ca9a201f-15c8-42e6-8e65-601382dd2c39-kube-api-access-llzgs\") pod \"dnsmasq-dns-548b47b48c-k2hgl\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.534854 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" path="/var/lib/kubelet/pods/a87af91b-c71e-4d5e-a7d2-10fa502a6dc9/volumes" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 
13:55:26.535410 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-gc7p2"] Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.539785 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.562268 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-gc7p2"] Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.581829 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-ovndb-tls-certs\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.581878 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-config\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.581936 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-scripts\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.581985 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-nb\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582057 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bjx5\" (UniqueName: \"kubernetes.io/projected/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-kube-api-access-5bjx5\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582087 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-config\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582114 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-combined-ca-bundle\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582160 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-sb\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " 
pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582185 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-httpd-config\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582265 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-swift-storage-0\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582303 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582355 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdxpf\" (UniqueName: \"kubernetes.io/projected/4ddf1156-f78b-43ce-bca1-44c026df8262-kube-api-access-hdxpf\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582389 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-svc\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582432 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582459 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4ddf1156-f78b-43ce-bca1-44c026df8262-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.582521 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26sn2\" (UniqueName: \"kubernetes.io/projected/fc3e6347-c27f-4249-a1b7-145165c06d70-kube-api-access-26sn2\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.593665 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-ovndb-tls-certs\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.600072 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.600970 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4ddf1156-f78b-43ce-bca1-44c026df8262-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.612139 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.616438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-combined-ca-bundle\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.621353 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.628938 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-config\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.629305 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-httpd-config\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.637042 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-scripts\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.637699 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26sn2\" (UniqueName: \"kubernetes.io/projected/fc3e6347-c27f-4249-a1b7-145165c06d70-kube-api-access-26sn2\") pod \"neutron-576df9b9d8-5btc5\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") " 
pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.660718 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdxpf\" (UniqueName: \"kubernetes.io/projected/4ddf1156-f78b-43ce-bca1-44c026df8262-kube-api-access-hdxpf\") pod \"cinder-scheduler-0\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.680318 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.682185 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.684792 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-nb\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.684860 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bjx5\" (UniqueName: \"kubernetes.io/projected/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-kube-api-access-5bjx5\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.684888 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-config\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.684914 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-sb\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.684950 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-swift-storage-0\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.684997 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-svc\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.685958 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-svc\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.686517 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-nb\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.687348 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-config\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.690646 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.691762 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-swift-storage-0\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.704121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-sb\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.706292 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.717315 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bjx5\" (UniqueName: \"kubernetes.io/projected/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-kube-api-access-5bjx5\") pod \"dnsmasq-dns-6c47bb5d77-gc7p2\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.753855 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.757868 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.772176 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.786950 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9f85e26-3d6b-4cd4-ae18-78d136626e63-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.786997 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f85e26-3d6b-4cd4-ae18-78d136626e63-logs\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.787110 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.787149 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v85qt\" (UniqueName: \"kubernetes.io/projected/a9f85e26-3d6b-4cd4-ae18-78d136626e63-kube-api-access-v85qt\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.787191 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.787206 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-scripts\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.787238 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data-custom\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.886444 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bj78g"] Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.888145 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9f85e26-3d6b-4cd4-ae18-78d136626e63-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.888191 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f85e26-3d6b-4cd4-ae18-78d136626e63-logs\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc 
kubenswrapper[4763]: I0930 13:55:26.888280 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.888322 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v85qt\" (UniqueName: \"kubernetes.io/projected/a9f85e26-3d6b-4cd4-ae18-78d136626e63-kube-api-access-v85qt\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.888359 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9f85e26-3d6b-4cd4-ae18-78d136626e63-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.888379 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.888419 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-scripts\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.888461 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data-custom\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.888874 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f85e26-3d6b-4cd4-ae18-78d136626e63-logs\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.896839 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-scripts\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.907167 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.908291 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data-custom\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.918697 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.921876 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v85qt\" (UniqueName: \"kubernetes.io/projected/a9f85e26-3d6b-4cd4-ae18-78d136626e63-kube-api-access-v85qt\") pod \"cinder-api-0\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " pod="openstack/cinder-api-0" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.942000 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:26 crc kubenswrapper[4763]: I0930 13:55:26.977594 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.083213 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xxzdg"] Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.093230 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llzgs\" (UniqueName: \"kubernetes.io/projected/ca9a201f-15c8-42e6-8e65-601382dd2c39-kube-api-access-llzgs\") pod \"ca9a201f-15c8-42e6-8e65-601382dd2c39\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.093331 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-sb\") pod \"ca9a201f-15c8-42e6-8e65-601382dd2c39\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.093489 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-swift-storage-0\") pod \"ca9a201f-15c8-42e6-8e65-601382dd2c39\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.093631 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-config\") pod \"ca9a201f-15c8-42e6-8e65-601382dd2c39\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.093738 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-nb\") pod \"ca9a201f-15c8-42e6-8e65-601382dd2c39\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.093816 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-svc\") pod \"ca9a201f-15c8-42e6-8e65-601382dd2c39\" (UID: \"ca9a201f-15c8-42e6-8e65-601382dd2c39\") " Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.095671 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"ca9a201f-15c8-42e6-8e65-601382dd2c39" (UID: "ca9a201f-15c8-42e6-8e65-601382dd2c39"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.095856 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-config" (OuterVolumeSpecName: "config") pod "ca9a201f-15c8-42e6-8e65-601382dd2c39" (UID: "ca9a201f-15c8-42e6-8e65-601382dd2c39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.098265 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca9a201f-15c8-42e6-8e65-601382dd2c39" (UID: "ca9a201f-15c8-42e6-8e65-601382dd2c39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.098670 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ca9a201f-15c8-42e6-8e65-601382dd2c39" (UID: "ca9a201f-15c8-42e6-8e65-601382dd2c39"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.102869 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ca9a201f-15c8-42e6-8e65-601382dd2c39" (UID: "ca9a201f-15c8-42e6-8e65-601382dd2c39"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.108280 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9a201f-15c8-42e6-8e65-601382dd2c39-kube-api-access-llzgs" (OuterVolumeSpecName: "kube-api-access-llzgs") pod "ca9a201f-15c8-42e6-8e65-601382dd2c39" (UID: "ca9a201f-15c8-42e6-8e65-601382dd2c39"). InnerVolumeSpecName "kube-api-access-llzgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.160642 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.196263 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.196299 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.196317 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.196333 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llzgs\" (UniqueName: \"kubernetes.io/projected/ca9a201f-15c8-42e6-8e65-601382dd2c39-kube-api-access-llzgs\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.196345 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.196353 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca9a201f-15c8-42e6-8e65-601382dd2c39-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.306758 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-x49w7"] Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.623492 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-gc7p2"] Sep 30 13:55:27 crc kubenswrapper[4763]: W0930 13:55:27.637923 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f0c4ebb_7bf6_43f3_8880_8e4e9f9874b0.slice/crio-8f95e7762a85d9259af0e728e2135623de9492235fcb0b05213d5812754912d9 WatchSource:0}: Error finding container 8f95e7762a85d9259af0e728e2135623de9492235fcb0b05213d5812754912d9: Status 404 returned error can't find the container with id 8f95e7762a85d9259af0e728e2135623de9492235fcb0b05213d5812754912d9 Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.737971 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-576df9b9d8-5btc5"] Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.793144 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.960452 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4ddf1156-f78b-43ce-bca1-44c026df8262","Type":"ContainerStarted","Data":"56e2bc3916f11b1951ac866a737a94f7758d437d6490680a0a00230c5710bfa7"} Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.962185 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:55:27 crc kubenswrapper[4763]: W0930 13:55:27.968341 4763 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9f85e26_3d6b_4cd4_ae18_78d136626e63.slice/crio-db3eeb2c273f207f91e0dec0d34b5d5002b37541ba0e74f89c11bab0c3996eda WatchSource:0}: Error finding container db3eeb2c273f207f91e0dec0d34b5d5002b37541ba0e74f89c11bab0c3996eda: Status 404 returned error can't find the container with id db3eeb2c273f207f91e0dec0d34b5d5002b37541ba0e74f89c11bab0c3996eda Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.970459 4763 generic.go:334] "Generic (PLEG): container finished" podID="f1296f29-0be2-4be4-bb9e-3670307d9d05" containerID="20139e3c78bc0baaef93ec4b062142a66e7644706689edf7d66fcdcdfa1ee54b" exitCode=0 Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.970529 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xxzdg" event={"ID":"f1296f29-0be2-4be4-bb9e-3670307d9d05","Type":"ContainerDied","Data":"20139e3c78bc0baaef93ec4b062142a66e7644706689edf7d66fcdcdfa1ee54b"} Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.970559 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xxzdg" event={"ID":"f1296f29-0be2-4be4-bb9e-3670307d9d05","Type":"ContainerStarted","Data":"0fdbede4385116ae79e0a79023900d17be6fb8c4e8f60ac10644c3bc8172a480"} Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.972550 4763 generic.go:334] "Generic (PLEG): container finished" podID="01d448d7-005e-443a-9931-01565aa7a5f1" containerID="a33f1d59438472e65bd0b823acb92857152c410b2221a830aec663d9df6536d7" exitCode=0 Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.972641 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bj78g" event={"ID":"01d448d7-005e-443a-9931-01565aa7a5f1","Type":"ContainerDied","Data":"a33f1d59438472e65bd0b823acb92857152c410b2221a830aec663d9df6536d7"} Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.972672 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bj78g" event={"ID":"01d448d7-005e-443a-9931-01565aa7a5f1","Type":"ContainerStarted","Data":"4f2107720801f5ec5686f4530bc5b7377aa18b062eb0761f055379e17ff887f7"} Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.989249 4763 generic.go:334] "Generic (PLEG): container finished" podID="9793edca-7ca4-4ccc-8448-42b6897bb3b9" containerID="075a46a3aaa42f1f7b706ec297479cc87c77334287a9c8d5cabb52aed65fd99f" exitCode=0 Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.989327 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-x49w7" event={"ID":"9793edca-7ca4-4ccc-8448-42b6897bb3b9","Type":"ContainerDied","Data":"075a46a3aaa42f1f7b706ec297479cc87c77334287a9c8d5cabb52aed65fd99f"} Sep 30 13:55:27 crc kubenswrapper[4763]: I0930 13:55:27.989524 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-x49w7" event={"ID":"9793edca-7ca4-4ccc-8448-42b6897bb3b9","Type":"ContainerStarted","Data":"9e75f50f59299a66659cf3c17a0baae247277ed3739363ebd3095ab6b7d9f185"} Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.013104 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" event={"ID":"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0","Type":"ContainerStarted","Data":"8f95e7762a85d9259af0e728e2135623de9492235fcb0b05213d5812754912d9"} Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.021774 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576df9b9d8-5btc5" 
event={"ID":"fc3e6347-c27f-4249-a1b7-145165c06d70","Type":"ContainerStarted","Data":"e5e7729ce3ef860ab0eca1555d7259d6a5075e8f2fe0fe815a41108915f6d066"} Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.049838 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.049867 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.049909 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3acbf6d3-0af6-49df-9884-7a79660c0d38","Type":"ContainerStarted","Data":"a418a32191c8b2605067051c33630bba5162e74fdb861a2ed31ba8b9662ffeeb"} Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.050061 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-548b47b48c-k2hgl" Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.127700 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-548b47b48c-k2hgl"] Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.143444 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.149639 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-548b47b48c-k2hgl"] Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.150464 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.153142 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:28 crc kubenswrapper[4763]: I0930 13:55:28.519128 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca9a201f-15c8-42e6-8e65-601382dd2c39" path="/var/lib/kubelet/pods/ca9a201f-15c8-42e6-8e65-601382dd2c39/volumes" Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.087710 4763 generic.go:334] "Generic (PLEG): container finished" podID="7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" containerID="3bb6c1094febeddb8fff27dc43669fa2c2470b5e86ec7be0f01dc90f42dfb92b" exitCode=0 Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.088187 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" event={"ID":"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0","Type":"ContainerDied","Data":"3bb6c1094febeddb8fff27dc43669fa2c2470b5e86ec7be0f01dc90f42dfb92b"} Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.132094 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576df9b9d8-5btc5" event={"ID":"fc3e6347-c27f-4249-a1b7-145165c06d70","Type":"ContainerStarted","Data":"b854e79c47c6f6e756da69e22feb04c256f5f94040230f706fcdea5ac2ac8dc0"} Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.132135 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576df9b9d8-5btc5" event={"ID":"fc3e6347-c27f-4249-a1b7-145165c06d70","Type":"ContainerStarted","Data":"4774b3f6ac7b157b234c17b092bac7e0ad012e21d52b9da28446843481c35238"} Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.132790 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.143170 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3acbf6d3-0af6-49df-9884-7a79660c0d38","Type":"ContainerStarted","Data":"4ac43ed1699efe54eac9b5d01a220fd631b9d25fdf740d9d8778c71039c57bc6"} Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.143329 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="ceilometer-central-agent" containerID="cri-o://b67209a39cc76a2d09a6f6508e0d53b09f7963f481bceb2a396c9759d3019c5a" gracePeriod=30 Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.143613 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.143662 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="proxy-httpd" containerID="cri-o://4ac43ed1699efe54eac9b5d01a220fd631b9d25fdf740d9d8778c71039c57bc6" gracePeriod=30 Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.143705 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="sg-core" containerID="cri-o://a418a32191c8b2605067051c33630bba5162e74fdb861a2ed31ba8b9662ffeeb" gracePeriod=30 Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.143756 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="ceilometer-notification-agent" containerID="cri-o://e78060f272ffda4310640310e6334115da0c0d71f1473241a98c76d12624ed0a" gracePeriod=30 Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.192932 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a9f85e26-3d6b-4cd4-ae18-78d136626e63","Type":"ContainerStarted","Data":"db3eeb2c273f207f91e0dec0d34b5d5002b37541ba0e74f89c11bab0c3996eda"} Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.197495 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.883761003 podStartE2EDuration="15.197484535s" podCreationTimestamp="2025-09-30 13:55:14 +0000 UTC" firstStartedPulling="2025-09-30 13:55:15.957368044 +0000 UTC m=+1188.095928329" lastFinishedPulling="2025-09-30 13:55:28.271091576 +0000 UTC m=+1200.409651861" observedRunningTime="2025-09-30 13:55:29.195063925 +0000 UTC m=+1201.333624220" watchObservedRunningTime="2025-09-30 13:55:29.197484535 +0000 UTC m=+1201.336044820" Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.198392 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-576df9b9d8-5btc5" podStartSLOduration=3.198388968 podStartE2EDuration="3.198388968s" podCreationTimestamp="2025-09-30 13:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:29.16822344 +0000 UTC m=+1201.306783725" watchObservedRunningTime="2025-09-30 13:55:29.198388968 +0000 UTC m=+1201.336949253" Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.510161 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.510553 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:55:29 crc kubenswrapper[4763]: 
Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.779162 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.815934 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-689f4d67f6-55mbs" podUID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Sep 30 13:55:29 crc kubenswrapper[4763]: I0930 13:55:29.820008 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-689f4d67f6-55mbs" podUID="a87af91b-c71e-4d5e-a7d2-10fa502a6dc9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.082040 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xxzdg"
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.123982 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bj78g"
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.151013 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-x49w7"
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.199908 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q28w9\" (UniqueName: \"kubernetes.io/projected/f1296f29-0be2-4be4-bb9e-3670307d9d05-kube-api-access-q28w9\") pod \"f1296f29-0be2-4be4-bb9e-3670307d9d05\" (UID: \"f1296f29-0be2-4be4-bb9e-3670307d9d05\") "
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.200122 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw5m5\" (UniqueName: \"kubernetes.io/projected/01d448d7-005e-443a-9931-01565aa7a5f1-kube-api-access-pw5m5\") pod \"01d448d7-005e-443a-9931-01565aa7a5f1\" (UID: \"01d448d7-005e-443a-9931-01565aa7a5f1\") "
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.200214 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w2t4\" (UniqueName: \"kubernetes.io/projected/9793edca-7ca4-4ccc-8448-42b6897bb3b9-kube-api-access-2w2t4\") pod \"9793edca-7ca4-4ccc-8448-42b6897bb3b9\" (UID: \"9793edca-7ca4-4ccc-8448-42b6897bb3b9\") "
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.218770 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9793edca-7ca4-4ccc-8448-42b6897bb3b9-kube-api-access-2w2t4" (OuterVolumeSpecName: "kube-api-access-2w2t4") pod "9793edca-7ca4-4ccc-8448-42b6897bb3b9" (UID: "9793edca-7ca4-4ccc-8448-42b6897bb3b9"). InnerVolumeSpecName "kube-api-access-2w2t4". PluginName "kubernetes.io/projected", VolumeGidValue ""
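[Annotation] Both barbican-api-689f4d67f6-55mbs probes fail the same way: the kubelet's HTTP GET against http://10.217.0.152:9311/healthcheck returns no headers before the probe's context deadline. A hedged way to reproduce the check from the node; the endpoint is copied from the failed-probe record, but the 1-second timeout is an assumption, since the real value lives in the container's readinessProbe spec:

    import urllib.request

    URL = "http://10.217.0.152:9311/healthcheck"  # from the probe record above
    try:
        # timeout=1 is assumed, not read from the pod spec
        with urllib.request.urlopen(URL, timeout=1) as resp:
            print("probe would pass:", resp.status)
    except Exception as exc:
        # a timeout here mirrors the kubelet's "context deadline exceeded"
        print("probe would fail:", exc)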
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.225246 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1296f29-0be2-4be4-bb9e-3670307d9d05-kube-api-access-q28w9" (OuterVolumeSpecName: "kube-api-access-q28w9") pod "f1296f29-0be2-4be4-bb9e-3670307d9d05" (UID: "f1296f29-0be2-4be4-bb9e-3670307d9d05"). InnerVolumeSpecName "kube-api-access-q28w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.227106 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d448d7-005e-443a-9931-01565aa7a5f1-kube-api-access-pw5m5" (OuterVolumeSpecName: "kube-api-access-pw5m5") pod "01d448d7-005e-443a-9931-01565aa7a5f1" (UID: "01d448d7-005e-443a-9931-01565aa7a5f1"). InnerVolumeSpecName "kube-api-access-pw5m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.236829 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4ddf1156-f78b-43ce-bca1-44c026df8262","Type":"ContainerStarted","Data":"7bd3b941ddbec803ad6bb4b3d8baac0714afacce2bcc227f78f81e9bdaf2c7dc"} Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.242303 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xxzdg" event={"ID":"f1296f29-0be2-4be4-bb9e-3670307d9d05","Type":"ContainerDied","Data":"0fdbede4385116ae79e0a79023900d17be6fb8c4e8f60ac10644c3bc8172a480"} Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.242345 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fdbede4385116ae79e0a79023900d17be6fb8c4e8f60ac10644c3bc8172a480" Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.242409 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xxzdg" Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.258084 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-x49w7" event={"ID":"9793edca-7ca4-4ccc-8448-42b6897bb3b9","Type":"ContainerDied","Data":"9e75f50f59299a66659cf3c17a0baae247277ed3739363ebd3095ab6b7d9f185"} Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.258127 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e75f50f59299a66659cf3c17a0baae247277ed3739363ebd3095ab6b7d9f185" Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.258201 4763 util.go:48] "No ready sandbox for pod can be found. 
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.285111 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" event={"ID":"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0","Type":"ContainerStarted","Data":"011c47ce23ad6f62f6305136adfd0a8f49ce729205d0d49b56f81c72fd11eaf4"}
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.285770 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2"
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.304880 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q28w9\" (UniqueName: \"kubernetes.io/projected/f1296f29-0be2-4be4-bb9e-3670307d9d05-kube-api-access-q28w9\") on node \"crc\" DevicePath \"\""
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.304902 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw5m5\" (UniqueName: \"kubernetes.io/projected/01d448d7-005e-443a-9931-01565aa7a5f1-kube-api-access-pw5m5\") on node \"crc\" DevicePath \"\""
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.304912 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w2t4\" (UniqueName: \"kubernetes.io/projected/9793edca-7ca4-4ccc-8448-42b6897bb3b9-kube-api-access-2w2t4\") on node \"crc\" DevicePath \"\""
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.318239 4763 generic.go:334] "Generic (PLEG): container finished" podID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerID="4ac43ed1699efe54eac9b5d01a220fd631b9d25fdf740d9d8778c71039c57bc6" exitCode=0
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.318300 4763 generic.go:334] "Generic (PLEG): container finished" podID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerID="a418a32191c8b2605067051c33630bba5162e74fdb861a2ed31ba8b9662ffeeb" exitCode=2
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.318313 4763 generic.go:334] "Generic (PLEG): container finished" podID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerID="e78060f272ffda4310640310e6334115da0c0d71f1473241a98c76d12624ed0a" exitCode=0
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.318386 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3acbf6d3-0af6-49df-9884-7a79660c0d38","Type":"ContainerDied","Data":"4ac43ed1699efe54eac9b5d01a220fd631b9d25fdf740d9d8778c71039c57bc6"}
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.318419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3acbf6d3-0af6-49df-9884-7a79660c0d38","Type":"ContainerDied","Data":"a418a32191c8b2605067051c33630bba5162e74fdb861a2ed31ba8b9662ffeeb"}
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.318434 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3acbf6d3-0af6-49df-9884-7a79660c0d38","Type":"ContainerDied","Data":"e78060f272ffda4310640310e6334115da0c0d71f1473241a98c76d12624ed0a"}
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.321592 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" podStartSLOduration=4.321570826 podStartE2EDuration="4.321570826s" podCreationTimestamp="2025-09-30 13:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:30.316458708 +0000 UTC m=+1202.455018993" watchObservedRunningTime="2025-09-30 13:55:30.321570826 +0000 UTC m=+1202.460131111"
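[Annotation] The three ceilometer-0 containers killed with gracePeriod=30 above report different exit codes: proxy-httpd and ceilometer-notification-agent exit 0 (clean SIGTERM handling), sg-core exits 2, and further down cinder-api-log exits 143, the conventional 128 + signal-number encoding for a process terminated by the signal itself. The encoding is easy to verify:

    import signal

    # 128 + signal number is the shell/runtime convention for
    # "terminated by signal"; 143 therefore means SIGTERM (15).
    assert 128 + signal.SIGTERM == 143
    # and 137 would mean SIGKILL (9), i.e. a graceful stop that
    # had to be escalated after the grace period expired
    assert 128 + signal.SIGKILL == 137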
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.330124 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a9f85e26-3d6b-4cd4-ae18-78d136626e63","Type":"ContainerStarted","Data":"bc22fd41d8157b9d25c4332b8e72e7b1afdc196b4e2b9e1bc2f0b2e12961dcdc"}
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.332653 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bj78g"
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.332701 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bj78g" event={"ID":"01d448d7-005e-443a-9931-01565aa7a5f1","Type":"ContainerDied","Data":"4f2107720801f5ec5686f4530bc5b7377aa18b062eb0761f055379e17ff887f7"}
Sep 30 13:55:30 crc kubenswrapper[4763]: I0930 13:55:30.333430 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f2107720801f5ec5686f4530bc5b7377aa18b062eb0761f055379e17ff887f7"
Sep 30 13:55:31 crc kubenswrapper[4763]: I0930 13:55:31.343654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a9f85e26-3d6b-4cd4-ae18-78d136626e63","Type":"ContainerStarted","Data":"4c429aeda2abf77f6f895adb18b29ed572d56bc9a84824b79b946f99a766190e"}
Sep 30 13:55:31 crc kubenswrapper[4763]: I0930 13:55:31.343953 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Sep 30 13:55:31 crc kubenswrapper[4763]: I0930 13:55:31.343792 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" containerName="cinder-api" containerID="cri-o://4c429aeda2abf77f6f895adb18b29ed572d56bc9a84824b79b946f99a766190e" gracePeriod=30
Sep 30 13:55:31 crc kubenswrapper[4763]: I0930 13:55:31.343753 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" containerName="cinder-api-log" containerID="cri-o://bc22fd41d8157b9d25c4332b8e72e7b1afdc196b4e2b9e1bc2f0b2e12961dcdc" gracePeriod=30
Sep 30 13:55:31 crc kubenswrapper[4763]: I0930 13:55:31.349895 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4ddf1156-f78b-43ce-bca1-44c026df8262","Type":"ContainerStarted","Data":"86e4a8eaca016f1864e0e310b89330cae9cc409c2648c7a69d63968d62aa1f69"}
Sep 30 13:55:31 crc kubenswrapper[4763]: I0930 13:55:31.370150 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.370129231 podStartE2EDuration="5.370129231s" podCreationTimestamp="2025-09-30 13:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:31.36969572 +0000 UTC m=+1203.508256005" watchObservedRunningTime="2025-09-30 13:55:31.370129231 +0000 UTC m=+1203.508689516"
Sep 30 13:55:31 crc kubenswrapper[4763]: I0930 13:55:31.388837 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.437787563 podStartE2EDuration="5.38881454s" podCreationTimestamp="2025-09-30 13:55:26 +0000 UTC" firstStartedPulling="2025-09-30 13:55:27.836414207 +0000 UTC m=+1199.974974492" lastFinishedPulling="2025-09-30 13:55:28.787441184 +0000 UTC m=+1200.926001469" observedRunningTime="2025-09-30 13:55:31.386258306 +0000 UTC m=+1203.524818591" watchObservedRunningTime="2025-09-30 13:55:31.38881454 +0000 UTC m=+1203.527374825"
Sep 30 13:55:31 crc kubenswrapper[4763]: I0930 13:55:31.773239 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.365646 4763 generic.go:334] "Generic (PLEG): container finished" podID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" containerID="4c429aeda2abf77f6f895adb18b29ed572d56bc9a84824b79b946f99a766190e" exitCode=0
Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.365675 4763 generic.go:334] "Generic (PLEG): container finished" podID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" containerID="bc22fd41d8157b9d25c4332b8e72e7b1afdc196b4e2b9e1bc2f0b2e12961dcdc" exitCode=143
Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.366530 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a9f85e26-3d6b-4cd4-ae18-78d136626e63","Type":"ContainerDied","Data":"4c429aeda2abf77f6f895adb18b29ed572d56bc9a84824b79b946f99a766190e"}
Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.366559 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a9f85e26-3d6b-4cd4-ae18-78d136626e63","Type":"ContainerDied","Data":"bc22fd41d8157b9d25c4332b8e72e7b1afdc196b4e2b9e1bc2f0b2e12961dcdc"}
Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.594525 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.612744 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-75bcdb8fc9-ml4n8"]
Sep 30 13:55:32 crc kubenswrapper[4763]: E0930 13:55:32.613144 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9793edca-7ca4-4ccc-8448-42b6897bb3b9" containerName="mariadb-database-create"
Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.613160 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9793edca-7ca4-4ccc-8448-42b6897bb3b9" containerName="mariadb-database-create"
Sep 30 13:55:32 crc kubenswrapper[4763]: E0930 13:55:32.613168 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" containerName="cinder-api-log"
Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.613174 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" containerName="cinder-api-log"
Sep 30 13:55:32 crc kubenswrapper[4763]: E0930 13:55:32.613190 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d448d7-005e-443a-9931-01565aa7a5f1" containerName="mariadb-database-create"
Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.613196 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d448d7-005e-443a-9931-01565aa7a5f1" containerName="mariadb-database-create"
Sep 30 13:55:32 crc kubenswrapper[4763]: E0930 13:55:32.613207 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1296f29-0be2-4be4-bb9e-3670307d9d05" containerName="mariadb-database-create"
Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.613212 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1296f29-0be2-4be4-bb9e-3670307d9d05" containerName="mariadb-database-create"
Sep 30 13:55:32 crc kubenswrapper[4763]: E0930 13:55:32.613225 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" containerName="cinder-api"
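[Annotation] The cpu_manager/state_mem/memory_manager burst above appears to be the kubelet reconciling its resource-manager checkpoints as it admits the new neutron-75bcdb8fc9-ml4n8 pod: containers belonging to pods that no longer exist (the finished mariadb-database-create jobs, the deleted cinder-api-0) are purged from the CPU and memory state first. Each purge logs twice, an E-line from cpu_manager.go and a matching I-line from state_mem.go. A small sketch tallying which container names were purged in a captured journal; the regex is illustrative:

    import re
    from collections import Counter

    STALE = re.compile(
        r'"RemoveStaleState: removing container" '
        r'podUID="(?P<uid>[0-9a-f-]+)" containerName="(?P<name>[\w-]+)"'
    )

    def stale_counts(journal_text: str) -> Counter:
        """Count stale-state removals per container name."""
        return Counter(m["name"] for m in STALE.finditer(journal_text))

    # For this capture the tally would include mariadb-database-create x3
    # plus cinder-api and cinder-api-log from the replaced cinder-api-0.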
podUID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" containerName="cinder-api" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.613230 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" containerName="cinder-api" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.613427 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d448d7-005e-443a-9931-01565aa7a5f1" containerName="mariadb-database-create" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.613440 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" containerName="cinder-api" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.613453 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1296f29-0be2-4be4-bb9e-3670307d9d05" containerName="mariadb-database-create" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.613464 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9793edca-7ca4-4ccc-8448-42b6897bb3b9" containerName="mariadb-database-create" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.613474 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" containerName="cinder-api-log" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.614440 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.621000 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75bcdb8fc9-ml4n8"] Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.667121 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.667342 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.667384 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data-custom\") pod \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.667492 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f85e26-3d6b-4cd4-ae18-78d136626e63-logs\") pod \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.667528 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-combined-ca-bundle\") pod \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.667647 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9f85e26-3d6b-4cd4-ae18-78d136626e63-etc-machine-id\") pod \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.667731 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data\") pod \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.667823 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v85qt\" (UniqueName: \"kubernetes.io/projected/a9f85e26-3d6b-4cd4-ae18-78d136626e63-kube-api-access-v85qt\") pod \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.667896 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-scripts\") pod \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\" (UID: \"a9f85e26-3d6b-4cd4-ae18-78d136626e63\") " Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.668007 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9f85e26-3d6b-4cd4-ae18-78d136626e63-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a9f85e26-3d6b-4cd4-ae18-78d136626e63" (UID: "a9f85e26-3d6b-4cd4-ae18-78d136626e63"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.668148 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f85e26-3d6b-4cd4-ae18-78d136626e63-logs" (OuterVolumeSpecName: "logs") pod "a9f85e26-3d6b-4cd4-ae18-78d136626e63" (UID: "a9f85e26-3d6b-4cd4-ae18-78d136626e63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.668156 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-config\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.668301 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-combined-ca-bundle\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.668484 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-public-tls-certs\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.668536 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-ovndb-tls-certs\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.668563 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-internal-tls-certs\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.668700 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-httpd-config\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.668752 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8zxh\" (UniqueName: \"kubernetes.io/projected/02c33b2c-ca4f-45a8-9920-63df9fc79108-kube-api-access-j8zxh\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.668854 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f85e26-3d6b-4cd4-ae18-78d136626e63-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.668866 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9f85e26-3d6b-4cd4-ae18-78d136626e63-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.683486 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-scripts" (OuterVolumeSpecName: "scripts") pod "a9f85e26-3d6b-4cd4-ae18-78d136626e63" (UID: "a9f85e26-3d6b-4cd4-ae18-78d136626e63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.687020 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a9f85e26-3d6b-4cd4-ae18-78d136626e63" (UID: "a9f85e26-3d6b-4cd4-ae18-78d136626e63"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.691048 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f85e26-3d6b-4cd4-ae18-78d136626e63-kube-api-access-v85qt" (OuterVolumeSpecName: "kube-api-access-v85qt") pod "a9f85e26-3d6b-4cd4-ae18-78d136626e63" (UID: "a9f85e26-3d6b-4cd4-ae18-78d136626e63"). InnerVolumeSpecName "kube-api-access-v85qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.729632 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9f85e26-3d6b-4cd4-ae18-78d136626e63" (UID: "a9f85e26-3d6b-4cd4-ae18-78d136626e63"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.770111 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-config\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.770331 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data" (OuterVolumeSpecName: "config-data") pod "a9f85e26-3d6b-4cd4-ae18-78d136626e63" (UID: "a9f85e26-3d6b-4cd4-ae18-78d136626e63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.770485 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-combined-ca-bundle\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.770750 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-public-tls-certs\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.770846 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-ovndb-tls-certs\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.770914 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-internal-tls-certs\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.771127 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-httpd-config\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.771503 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8zxh\" (UniqueName: \"kubernetes.io/projected/02c33b2c-ca4f-45a8-9920-63df9fc79108-kube-api-access-j8zxh\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.771657 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.771730 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v85qt\" 
(UniqueName: \"kubernetes.io/projected/a9f85e26-3d6b-4cd4-ae18-78d136626e63-kube-api-access-v85qt\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.771792 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.771853 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.771917 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f85e26-3d6b-4cd4-ae18-78d136626e63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.782500 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-public-tls-certs\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.783762 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.784048 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" containerName="glance-httpd" containerID="cri-o://a61f9690c80b1c238dbd56d88a7ef04eb303c9d58661a1e01a731d1e3c8f2914" gracePeriod=30 Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.784197 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-internal-tls-certs\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.784346 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" containerName="glance-log" containerID="cri-o://9c8915719c91e41929906d9743d00233d0186243330a277c24adefac9170aee0" gracePeriod=30 Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.790291 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-ovndb-tls-certs\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.791467 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8zxh\" (UniqueName: \"kubernetes.io/projected/02c33b2c-ca4f-45a8-9920-63df9fc79108-kube-api-access-j8zxh\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.792110 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-httpd-config\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.796205 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-combined-ca-bundle\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.796433 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-config\") pod \"neutron-75bcdb8fc9-ml4n8\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") " pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:32 crc kubenswrapper[4763]: I0930 13:55:32.878665 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.377858 4763 generic.go:334] "Generic (PLEG): container finished" podID="6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" containerID="9c8915719c91e41929906d9743d00233d0186243330a277c24adefac9170aee0" exitCode=143 Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.377972 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6","Type":"ContainerDied","Data":"9c8915719c91e41929906d9743d00233d0186243330a277c24adefac9170aee0"} Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.380085 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a9f85e26-3d6b-4cd4-ae18-78d136626e63","Type":"ContainerDied","Data":"db3eeb2c273f207f91e0dec0d34b5d5002b37541ba0e74f89c11bab0c3996eda"} Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.380128 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.380141 4763 scope.go:117] "RemoveContainer" containerID="4c429aeda2abf77f6f895adb18b29ed572d56bc9a84824b79b946f99a766190e" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.414358 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.416006 4763 scope.go:117] "RemoveContainer" containerID="bc22fd41d8157b9d25c4332b8e72e7b1afdc196b4e2b9e1bc2f0b2e12961dcdc" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.435866 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.448677 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75bcdb8fc9-ml4n8"] Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.475656 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.477404 4763 util.go:30] "No sandbox for pod can be found. 
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.480401 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.480645 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.480853 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.498950 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.587295 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data-custom\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0"
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.587367 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0"
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.587465 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0"
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.587578 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b321cfd6-9039-4fe6-a39c-619f101d5e30-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0"
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.587650 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0"
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.587719 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-scripts\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0"
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.587764 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0"
Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.587839 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b321cfd6-9039-4fe6-a39c-619f101d5e30-logs\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0"
\"kubernetes.io/empty-dir/b321cfd6-9039-4fe6-a39c-619f101d5e30-logs\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.587900 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj7kl\" (UniqueName: \"kubernetes.io/projected/b321cfd6-9039-4fe6-a39c-619f101d5e30-kube-api-access-dj7kl\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.689693 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj7kl\" (UniqueName: \"kubernetes.io/projected/b321cfd6-9039-4fe6-a39c-619f101d5e30-kube-api-access-dj7kl\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.690075 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data-custom\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.690093 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.690111 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.690700 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b321cfd6-9039-4fe6-a39c-619f101d5e30-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.690745 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b321cfd6-9039-4fe6-a39c-619f101d5e30-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.690758 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.690879 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-scripts\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.690920 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.690975 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b321cfd6-9039-4fe6-a39c-619f101d5e30-logs\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.692436 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b321cfd6-9039-4fe6-a39c-619f101d5e30-logs\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.695941 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.696688 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.697212 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-scripts\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.697541 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.697715 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data-custom\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.703846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.715344 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj7kl\" (UniqueName: \"kubernetes.io/projected/b321cfd6-9039-4fe6-a39c-619f101d5e30-kube-api-access-dj7kl\") pod \"cinder-api-0\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " pod="openstack/cinder-api-0" Sep 30 13:55:33 crc kubenswrapper[4763]: I0930 13:55:33.800612 4763 util.go:30] "No sandbox for pod can be found. 
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.266734 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Sep 30 13:55:34 crc kubenswrapper[4763]: W0930 13:55:34.284000 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb321cfd6_9039_4fe6_a39c_619f101d5e30.slice/crio-7ed00235354744d2bb87a6a58240e6d5dd0fcab0d185bf8a571d80442c36ebbf WatchSource:0}: Error finding container 7ed00235354744d2bb87a6a58240e6d5dd0fcab0d185bf8a571d80442c36ebbf: Status 404 returned error can't find the container with id 7ed00235354744d2bb87a6a58240e6d5dd0fcab0d185bf8a571d80442c36ebbf
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.397806 4763 generic.go:334] "Generic (PLEG): container finished" podID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerID="b67209a39cc76a2d09a6f6508e0d53b09f7963f481bceb2a396c9759d3019c5a" exitCode=0
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.397878 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3acbf6d3-0af6-49df-9884-7a79660c0d38","Type":"ContainerDied","Data":"b67209a39cc76a2d09a6f6508e0d53b09f7963f481bceb2a396c9759d3019c5a"}
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.405946 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b321cfd6-9039-4fe6-a39c-619f101d5e30","Type":"ContainerStarted","Data":"7ed00235354744d2bb87a6a58240e6d5dd0fcab0d185bf8a571d80442c36ebbf"}
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.412127 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75bcdb8fc9-ml4n8" event={"ID":"02c33b2c-ca4f-45a8-9920-63df9fc79108","Type":"ContainerStarted","Data":"d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb"}
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.412171 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75bcdb8fc9-ml4n8" event={"ID":"02c33b2c-ca4f-45a8-9920-63df9fc79108","Type":"ContainerStarted","Data":"e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb"}
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.412180 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75bcdb8fc9-ml4n8" event={"ID":"02c33b2c-ca4f-45a8-9920-63df9fc79108","Type":"ContainerStarted","Data":"04c023bec9e4143659e633d27f5d5e3b1f39ebaa173001f3ab9f9c3a93389c0d"}
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.412278 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-75bcdb8fc9-ml4n8"
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.440649 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-75bcdb8fc9-ml4n8" podStartSLOduration=2.44063122 podStartE2EDuration="2.44063122s" podCreationTimestamp="2025-09-30 13:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:34.430431984 +0000 UTC m=+1206.568992269" watchObservedRunningTime="2025-09-30 13:55:34.44063122 +0000 UTC m=+1206.579191505"
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.505248 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f85e26-3d6b-4cd4-ae18-78d136626e63" path="/var/lib/kubelet/pods/a9f85e26-3d6b-4cd4-ae18-78d136626e63/volumes"
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.819477 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
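[Annotation] The W-line from manager.go:1169 above is cAdvisor racing the runtime: the cgroup for crio-7ed00235... appears before CRI-O can answer a status query for that ID, so the watch handler gets a 404. In this capture the race is benign; the same container ID is reported as ContainerStarted for cinder-api-0 roughly 120 ms later (13:55:34.284000 vs 13:55:34.405946), and the earlier 404 for db3eeb2c... resolved the same way. Such warnings are worth correlating rather than alarming on; a sketch, with illustrative regexes:

    import re

    NOTFOUND = re.compile(r"can't find the container with id (?P<cid>[0-9a-f]{64})")
    STARTED  = re.compile(r'"ContainerStarted","Data":"(?P<cid>[0-9a-f]{64})"')

    def unresolved_404s(journal_text: str) -> set:
        """Container IDs that got a cAdvisor 404 but never reached ContainerStarted."""
        missing = {m["cid"] for m in NOTFOUND.finditer(journal_text)}
        started = {m["cid"] for m in STARTED.finditer(journal_text)}
        return missing - started   # empty for this capture: both 404s resolved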
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.914385 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-sg-core-conf-yaml\") pod \"3acbf6d3-0af6-49df-9884-7a79660c0d38\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") "
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.914451 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-combined-ca-bundle\") pod \"3acbf6d3-0af6-49df-9884-7a79660c0d38\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") "
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.914574 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-scripts\") pod \"3acbf6d3-0af6-49df-9884-7a79660c0d38\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") "
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.914645 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvl9z\" (UniqueName: \"kubernetes.io/projected/3acbf6d3-0af6-49df-9884-7a79660c0d38-kube-api-access-fvl9z\") pod \"3acbf6d3-0af6-49df-9884-7a79660c0d38\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") "
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.914710 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-run-httpd\") pod \"3acbf6d3-0af6-49df-9884-7a79660c0d38\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") "
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.914737 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-log-httpd\") pod \"3acbf6d3-0af6-49df-9884-7a79660c0d38\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") "
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.914790 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-config-data\") pod \"3acbf6d3-0af6-49df-9884-7a79660c0d38\" (UID: \"3acbf6d3-0af6-49df-9884-7a79660c0d38\") "
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.915372 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3acbf6d3-0af6-49df-9884-7a79660c0d38" (UID: "3acbf6d3-0af6-49df-9884-7a79660c0d38"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.915498 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3acbf6d3-0af6-49df-9884-7a79660c0d38" (UID: "3acbf6d3-0af6-49df-9884-7a79660c0d38"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.920183 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3acbf6d3-0af6-49df-9884-7a79660c0d38-kube-api-access-fvl9z" (OuterVolumeSpecName: "kube-api-access-fvl9z") pod "3acbf6d3-0af6-49df-9884-7a79660c0d38" (UID: "3acbf6d3-0af6-49df-9884-7a79660c0d38"). InnerVolumeSpecName "kube-api-access-fvl9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.920589 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-scripts" (OuterVolumeSpecName: "scripts") pod "3acbf6d3-0af6-49df-9884-7a79660c0d38" (UID: "3acbf6d3-0af6-49df-9884-7a79660c0d38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:34 crc kubenswrapper[4763]: I0930 13:55:34.964765 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3acbf6d3-0af6-49df-9884-7a79660c0d38" (UID: "3acbf6d3-0af6-49df-9884-7a79660c0d38"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.021368 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.021974 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3acbf6d3-0af6-49df-9884-7a79660c0d38-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.022039 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.022109 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.022231 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvl9z\" (UniqueName: \"kubernetes.io/projected/3acbf6d3-0af6-49df-9884-7a79660c0d38-kube-api-access-fvl9z\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.044895 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-config-data" (OuterVolumeSpecName: "config-data") pod "3acbf6d3-0af6-49df-9884-7a79660c0d38" (UID: "3acbf6d3-0af6-49df-9884-7a79660c0d38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.060456 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3acbf6d3-0af6-49df-9884-7a79660c0d38" (UID: "3acbf6d3-0af6-49df-9884-7a79660c0d38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.124010 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.124053 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3acbf6d3-0af6-49df-9884-7a79660c0d38-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.433351 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.433362 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3acbf6d3-0af6-49df-9884-7a79660c0d38","Type":"ContainerDied","Data":"81e6cb704768bfad243ca58e76052eea4884346782b0bae2d0c113da32235d49"} Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.433414 4763 scope.go:117] "RemoveContainer" containerID="4ac43ed1699efe54eac9b5d01a220fd631b9d25fdf740d9d8778c71039c57bc6" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.440667 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b321cfd6-9039-4fe6-a39c-619f101d5e30","Type":"ContainerStarted","Data":"b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11"} Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.470557 4763 scope.go:117] "RemoveContainer" containerID="a418a32191c8b2605067051c33630bba5162e74fdb861a2ed31ba8b9662ffeeb" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.473699 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.485534 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.521971 4763 scope.go:117] "RemoveContainer" containerID="e78060f272ffda4310640310e6334115da0c0d71f1473241a98c76d12624ed0a" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.527351 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:35 crc kubenswrapper[4763]: E0930 13:55:35.527778 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="ceilometer-central-agent" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.527794 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="ceilometer-central-agent" Sep 30 13:55:35 crc kubenswrapper[4763]: E0930 13:55:35.527809 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="proxy-httpd" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.527816 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="proxy-httpd" Sep 30 13:55:35 crc kubenswrapper[4763]: E0930 13:55:35.527824 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="ceilometer-notification-agent" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.527832 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" 
containerName="ceilometer-notification-agent" Sep 30 13:55:35 crc kubenswrapper[4763]: E0930 13:55:35.527846 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="sg-core" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.527852 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="sg-core" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.528015 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="ceilometer-central-agent" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.528023 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="sg-core" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.528036 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="proxy-httpd" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.528052 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" containerName="ceilometer-notification-agent" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.529575 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.531557 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.531878 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.537682 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.563188 4763 scope.go:117] "RemoveContainer" containerID="b67209a39cc76a2d09a6f6508e0d53b09f7963f481bceb2a396c9759d3019c5a" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.637566 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-log-httpd\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.637988 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-run-httpd\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.638014 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.638041 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-config-data\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" 
Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.638058 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-scripts\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.638086 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.638137 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xxv2\" (UniqueName: \"kubernetes.io/projected/087a51db-aab7-45f7-986b-dbbe1c1edd39-kube-api-access-2xxv2\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.740223 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-run-httpd\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.740278 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.740323 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-config-data\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.740353 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-scripts\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.740392 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.740445 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xxv2\" (UniqueName: \"kubernetes.io/projected/087a51db-aab7-45f7-986b-dbbe1c1edd39-kube-api-access-2xxv2\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.740518 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-log-httpd\") pod \"ceilometer-0\" (UID: 
\"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.740670 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-run-httpd\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.741047 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-log-httpd\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.745806 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.746673 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-scripts\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.747017 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.748303 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-config-data\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.758842 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xxv2\" (UniqueName: \"kubernetes.io/projected/087a51db-aab7-45f7-986b-dbbe1c1edd39-kube-api-access-2xxv2\") pod \"ceilometer-0\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " pod="openstack/ceilometer-0" Sep 30 13:55:35 crc kubenswrapper[4763]: I0930 13:55:35.852617 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.341692 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.461901 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"087a51db-aab7-45f7-986b-dbbe1c1edd39","Type":"ContainerStarted","Data":"6d915f9a09593d925eb52131cf39fc7d4d3b0bee2e0ec4debfffb6b9698e5896"} Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.467787 4763 generic.go:334] "Generic (PLEG): container finished" podID="6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" containerID="a61f9690c80b1c238dbd56d88a7ef04eb303c9d58661a1e01a731d1e3c8f2914" exitCode=0 Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.468321 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6","Type":"ContainerDied","Data":"a61f9690c80b1c238dbd56d88a7ef04eb303c9d58661a1e01a731d1e3c8f2914"} Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.471491 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b321cfd6-9039-4fe6-a39c-619f101d5e30","Type":"ContainerStarted","Data":"ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc"} Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.471684 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.507590 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3acbf6d3-0af6-49df-9884-7a79660c0d38" path="/var/lib/kubelet/pods/3acbf6d3-0af6-49df-9884-7a79660c0d38/volumes" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.527136 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.546401 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.546380547 podStartE2EDuration="3.546380547s" podCreationTimestamp="2025-09-30 13:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:36.498545997 +0000 UTC m=+1208.637106292" watchObservedRunningTime="2025-09-30 13:55:36.546380547 +0000 UTC m=+1208.684940832" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.659910 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-config-data\") pod \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.659996 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-scripts\") pod \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.660062 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.660082 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-combined-ca-bundle\") pod \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.660125 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-internal-tls-certs\") pod \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.660161 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-logs\") pod \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.660216 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnw98\" (UniqueName: \"kubernetes.io/projected/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-kube-api-access-gnw98\") pod \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.660244 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-httpd-run\") pod \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\" (UID: \"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6\") " Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.660973 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" (UID: "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.661219 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-logs" (OuterVolumeSpecName: "logs") pod "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" (UID: "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.667008 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" (UID: "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.669687 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-kube-api-access-gnw98" (OuterVolumeSpecName: "kube-api-access-gnw98") pod "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" (UID: "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6"). InnerVolumeSpecName "kube-api-access-gnw98". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.672092 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-scripts" (OuterVolumeSpecName: "scripts") pod "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" (UID: "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.691761 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" (UID: "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.733204 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" (UID: "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.736444 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-config-data" (OuterVolumeSpecName: "config-data") pod "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" (UID: "6ce2bb5a-59f2-44ca-92ef-4b98681acdc6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.759813 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.762163 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.762186 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.762199 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnw98\" (UniqueName: \"kubernetes.io/projected/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-kube-api-access-gnw98\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.762211 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.762221 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.762231 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.762265 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.762278 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.783883 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.814948 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-d4j9z"] Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.815231 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" podUID="816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" containerName="dnsmasq-dns" containerID="cri-o://f9a34c1f18e1e6a6311e5d6acc5d8c2592695c8e4e72b64dce792788c444b41b" gracePeriod=10 Sep 30 13:55:36 crc kubenswrapper[4763]: I0930 13:55:36.865200 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.086854 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 
13:55:37.144688 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.500300 4763 generic.go:334] "Generic (PLEG): container finished" podID="816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" containerID="f9a34c1f18e1e6a6311e5d6acc5d8c2592695c8e4e72b64dce792788c444b41b" exitCode=0 Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.500364 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" event={"ID":"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c","Type":"ContainerDied","Data":"f9a34c1f18e1e6a6311e5d6acc5d8c2592695c8e4e72b64dce792788c444b41b"} Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.502963 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4ddf1156-f78b-43ce-bca1-44c026df8262" containerName="cinder-scheduler" containerID="cri-o://7bd3b941ddbec803ad6bb4b3d8baac0714afacce2bcc227f78f81e9bdaf2c7dc" gracePeriod=30 Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.503025 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ce2bb5a-59f2-44ca-92ef-4b98681acdc6","Type":"ContainerDied","Data":"0bb68e228b6fe7d933a8f605d2eb3f42b9e3c06c8345878b91c9779ed1a22422"} Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.503069 4763 scope.go:117] "RemoveContainer" containerID="a61f9690c80b1c238dbd56d88a7ef04eb303c9d58661a1e01a731d1e3c8f2914" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.503289 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4ddf1156-f78b-43ce-bca1-44c026df8262" containerName="probe" containerID="cri-o://86e4a8eaca016f1864e0e310b89330cae9cc409c2648c7a69d63968d62aa1f69" gracePeriod=30 Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.503469 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.556779 4763 scope.go:117] "RemoveContainer" containerID="9c8915719c91e41929906d9743d00233d0186243330a277c24adefac9170aee0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.564058 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.580658 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.591301 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:37 crc kubenswrapper[4763]: E0930 13:55:37.592100 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" containerName="glance-log" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.592118 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" containerName="glance-log" Sep 30 13:55:37 crc kubenswrapper[4763]: E0930 13:55:37.592137 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" containerName="glance-httpd" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.592146 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" containerName="glance-httpd" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.592361 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" containerName="glance-httpd" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.592377 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" containerName="glance-log" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.593821 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.597988 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.598350 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.607668 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.689883 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-logs\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.689957 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.689986 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.690091 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.690129 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.690149 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.690191 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxsth\" (UniqueName: \"kubernetes.io/projected/ce55d11a-887c-46e6-af05-90c3fca01e75-kube-api-access-sxsth\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.690236 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.800653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.800710 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.800731 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.800763 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxsth\" (UniqueName: \"kubernetes.io/projected/ce55d11a-887c-46e6-af05-90c3fca01e75-kube-api-access-sxsth\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.800803 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.800824 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-logs\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.800853 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.800872 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.808384 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-logs\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.808512 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.808797 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.814182 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.816357 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.827849 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.829535 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.843478 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxsth\" (UniqueName: \"kubernetes.io/projected/ce55d11a-887c-46e6-af05-90c3fca01e75-kube-api-access-sxsth\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.861978 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " pod="openstack/glance-default-internal-api-0" Sep 30 13:55:37 crc kubenswrapper[4763]: I0930 13:55:37.933142 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.200030 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.308921 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-sb\") pod \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.309064 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-svc\") pod \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.309153 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-nb\") pod \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.309206 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-swift-storage-0\") pod \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.309254 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-config\") pod \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.309282 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbsn2\" (UniqueName: \"kubernetes.io/projected/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-kube-api-access-lbsn2\") pod \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\" (UID: \"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c\") " Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.314548 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-kube-api-access-lbsn2" (OuterVolumeSpecName: "kube-api-access-lbsn2") pod "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" (UID: "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c"). InnerVolumeSpecName "kube-api-access-lbsn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.363313 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" (UID: "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.363964 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" (UID: "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.366035 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" (UID: "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.372185 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" (UID: "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.374085 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-config" (OuterVolumeSpecName: "config") pod "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" (UID: "816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.411324 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.411363 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.411376 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.411388 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.411402 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.411415 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbsn2\" (UniqueName: \"kubernetes.io/projected/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c-kube-api-access-lbsn2\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.525789 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6ce2bb5a-59f2-44ca-92ef-4b98681acdc6" path="/var/lib/kubelet/pods/6ce2bb5a-59f2-44ca-92ef-4b98681acdc6/volumes" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.528682 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.529260 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c856dc5f9-d4j9z" event={"ID":"816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c","Type":"ContainerDied","Data":"e97365fc1ef4b818e47d073550b6b78882a09b14b45a2bf8d3212128e27a1349"} Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.529329 4763 scope.go:117] "RemoveContainer" containerID="f9a34c1f18e1e6a6311e5d6acc5d8c2592695c8e4e72b64dce792788c444b41b" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.539050 4763 generic.go:334] "Generic (PLEG): container finished" podID="4ddf1156-f78b-43ce-bca1-44c026df8262" containerID="86e4a8eaca016f1864e0e310b89330cae9cc409c2648c7a69d63968d62aa1f69" exitCode=0 Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.539110 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4ddf1156-f78b-43ce-bca1-44c026df8262","Type":"ContainerDied","Data":"86e4a8eaca016f1864e0e310b89330cae9cc409c2648c7a69d63968d62aa1f69"} Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.545457 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"087a51db-aab7-45f7-986b-dbbe1c1edd39","Type":"ContainerStarted","Data":"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4"} Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.563220 4763 scope.go:117] "RemoveContainer" containerID="d4e2a792cf4575c78a6c335d5016196dbad39501ff3a8c770eb5c7136d095ee4" Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.575115 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:55:38 crc kubenswrapper[4763]: W0930 13:55:38.580532 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce55d11a_887c_46e6_af05_90c3fca01e75.slice/crio-72fe39f005d969bb9005ec3d228a785f291680c2e081a734c563831b7f077f26 WatchSource:0}: Error finding container 72fe39f005d969bb9005ec3d228a785f291680c2e081a734c563831b7f077f26: Status 404 returned error can't find the container with id 72fe39f005d969bb9005ec3d228a785f291680c2e081a734c563831b7f077f26 Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.587480 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-d4j9z"] Sep 30 13:55:38 crc kubenswrapper[4763]: I0930 13:55:38.596973 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-d4j9z"] Sep 30 13:55:39 crc kubenswrapper[4763]: I0930 13:55:39.593713 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce55d11a-887c-46e6-af05-90c3fca01e75","Type":"ContainerStarted","Data":"3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d"} Sep 30 13:55:39 crc kubenswrapper[4763]: I0930 13:55:39.594089 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce55d11a-887c-46e6-af05-90c3fca01e75","Type":"ContainerStarted","Data":"72fe39f005d969bb9005ec3d228a785f291680c2e081a734c563831b7f077f26"} Sep 30 13:55:39 crc kubenswrapper[4763]: I0930 
13:55:39.603457 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"087a51db-aab7-45f7-986b-dbbe1c1edd39","Type":"ContainerStarted","Data":"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944"} Sep 30 13:55:40 crc kubenswrapper[4763]: I0930 13:55:40.498936 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" path="/var/lib/kubelet/pods/816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c/volumes" Sep 30 13:55:40 crc kubenswrapper[4763]: I0930 13:55:40.613120 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce55d11a-887c-46e6-af05-90c3fca01e75","Type":"ContainerStarted","Data":"26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a"} Sep 30 13:55:40 crc kubenswrapper[4763]: I0930 13:55:40.616708 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"087a51db-aab7-45f7-986b-dbbe1c1edd39","Type":"ContainerStarted","Data":"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31"} Sep 30 13:55:40 crc kubenswrapper[4763]: I0930 13:55:40.640910 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.640885555 podStartE2EDuration="3.640885555s" podCreationTimestamp="2025-09-30 13:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:40.634582277 +0000 UTC m=+1212.773142562" watchObservedRunningTime="2025-09-30 13:55:40.640885555 +0000 UTC m=+1212.779445840" Sep 30 13:55:41 crc kubenswrapper[4763]: I0930 13:55:41.627047 4763 generic.go:334] "Generic (PLEG): container finished" podID="4ddf1156-f78b-43ce-bca1-44c026df8262" containerID="7bd3b941ddbec803ad6bb4b3d8baac0714afacce2bcc227f78f81e9bdaf2c7dc" exitCode=0 Sep 30 13:55:41 crc kubenswrapper[4763]: I0930 13:55:41.627129 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4ddf1156-f78b-43ce-bca1-44c026df8262","Type":"ContainerDied","Data":"7bd3b941ddbec803ad6bb4b3d8baac0714afacce2bcc227f78f81e9bdaf2c7dc"} Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.348637 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.400561 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data\") pod \"4ddf1156-f78b-43ce-bca1-44c026df8262\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.400732 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-scripts\") pod \"4ddf1156-f78b-43ce-bca1-44c026df8262\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.400764 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data-custom\") pod \"4ddf1156-f78b-43ce-bca1-44c026df8262\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.400924 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4ddf1156-f78b-43ce-bca1-44c026df8262-etc-machine-id\") pod \"4ddf1156-f78b-43ce-bca1-44c026df8262\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.400984 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdxpf\" (UniqueName: \"kubernetes.io/projected/4ddf1156-f78b-43ce-bca1-44c026df8262-kube-api-access-hdxpf\") pod \"4ddf1156-f78b-43ce-bca1-44c026df8262\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.401021 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-combined-ca-bundle\") pod \"4ddf1156-f78b-43ce-bca1-44c026df8262\" (UID: \"4ddf1156-f78b-43ce-bca1-44c026df8262\") " Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.401084 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ddf1156-f78b-43ce-bca1-44c026df8262-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4ddf1156-f78b-43ce-bca1-44c026df8262" (UID: "4ddf1156-f78b-43ce-bca1-44c026df8262"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.401616 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4ddf1156-f78b-43ce-bca1-44c026df8262-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.406216 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ddf1156-f78b-43ce-bca1-44c026df8262-kube-api-access-hdxpf" (OuterVolumeSpecName: "kube-api-access-hdxpf") pod "4ddf1156-f78b-43ce-bca1-44c026df8262" (UID: "4ddf1156-f78b-43ce-bca1-44c026df8262"). InnerVolumeSpecName "kube-api-access-hdxpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.409849 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-scripts" (OuterVolumeSpecName: "scripts") pod "4ddf1156-f78b-43ce-bca1-44c026df8262" (UID: "4ddf1156-f78b-43ce-bca1-44c026df8262"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.409964 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4ddf1156-f78b-43ce-bca1-44c026df8262" (UID: "4ddf1156-f78b-43ce-bca1-44c026df8262"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.485367 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ddf1156-f78b-43ce-bca1-44c026df8262" (UID: "4ddf1156-f78b-43ce-bca1-44c026df8262"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.503817 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdxpf\" (UniqueName: \"kubernetes.io/projected/4ddf1156-f78b-43ce-bca1-44c026df8262-kube-api-access-hdxpf\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.503842 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.503855 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.503865 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.531756 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data" (OuterVolumeSpecName: "config-data") pod "4ddf1156-f78b-43ce-bca1-44c026df8262" (UID: "4ddf1156-f78b-43ce-bca1-44c026df8262"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.605148 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ddf1156-f78b-43ce-bca1-44c026df8262-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.640267 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"087a51db-aab7-45f7-986b-dbbe1c1edd39","Type":"ContainerStarted","Data":"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28"} Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.641719 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.644278 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4ddf1156-f78b-43ce-bca1-44c026df8262","Type":"ContainerDied","Data":"56e2bc3916f11b1951ac866a737a94f7758d437d6490680a0a00230c5710bfa7"} Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.644334 4763 scope.go:117] "RemoveContainer" containerID="86e4a8eaca016f1864e0e310b89330cae9cc409c2648c7a69d63968d62aa1f69" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.644467 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.667990 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.573941476 podStartE2EDuration="7.667971398s" podCreationTimestamp="2025-09-30 13:55:35 +0000 UTC" firstStartedPulling="2025-09-30 13:55:36.341867525 +0000 UTC m=+1208.480427820" lastFinishedPulling="2025-09-30 13:55:41.435897447 +0000 UTC m=+1213.574457742" observedRunningTime="2025-09-30 13:55:42.664471411 +0000 UTC m=+1214.803031706" watchObservedRunningTime="2025-09-30 13:55:42.667971398 +0000 UTC m=+1214.806531683" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.683362 4763 scope.go:117] "RemoveContainer" containerID="7bd3b941ddbec803ad6bb4b3d8baac0714afacce2bcc227f78f81e9bdaf2c7dc" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.687723 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.694216 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.708640 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:55:42 crc kubenswrapper[4763]: E0930 13:55:42.709018 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ddf1156-f78b-43ce-bca1-44c026df8262" containerName="cinder-scheduler" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.709035 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ddf1156-f78b-43ce-bca1-44c026df8262" containerName="cinder-scheduler" Sep 30 13:55:42 crc kubenswrapper[4763]: E0930 13:55:42.709048 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ddf1156-f78b-43ce-bca1-44c026df8262" containerName="probe" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.709055 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ddf1156-f78b-43ce-bca1-44c026df8262" containerName="probe" Sep 30 13:55:42 crc kubenswrapper[4763]: E0930 13:55:42.709078 4763 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" containerName="init" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.709084 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" containerName="init" Sep 30 13:55:42 crc kubenswrapper[4763]: E0930 13:55:42.709102 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" containerName="dnsmasq-dns" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.709107 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" containerName="dnsmasq-dns" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.709270 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ddf1156-f78b-43ce-bca1-44c026df8262" containerName="probe" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.709293 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ddf1156-f78b-43ce-bca1-44c026df8262" containerName="cinder-scheduler" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.709306 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="816d9b64-2c0b-4b24-822d-fe8ec3b6ea9c" containerName="dnsmasq-dns" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.710220 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.712471 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.733724 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.810477 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-scripts\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.810661 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7gtt\" (UniqueName: \"kubernetes.io/projected/8bb91013-85e0-4a13-9a06-0608b16a147b-kube-api-access-z7gtt\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.810748 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.810837 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.810939 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.811098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bb91013-85e0-4a13-9a06-0608b16a147b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.912358 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.912429 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bb91013-85e0-4a13-9a06-0608b16a147b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.912509 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-scripts\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.912529 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7gtt\" (UniqueName: \"kubernetes.io/projected/8bb91013-85e0-4a13-9a06-0608b16a147b-kube-api-access-z7gtt\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.912555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.912574 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.913049 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bb91013-85e0-4a13-9a06-0608b16a147b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.921303 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-scripts\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 
Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.926360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0"
Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.940558 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0"
Sep 30 13:55:42 crc kubenswrapper[4763]: I0930 13:55:42.959178 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7gtt\" (UniqueName: \"kubernetes.io/projected/8bb91013-85e0-4a13-9a06-0608b16a147b-kube-api-access-z7gtt\") pod \"cinder-scheduler-0\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " pod="openstack/cinder-scheduler-0"
Sep 30 13:55:43 crc kubenswrapper[4763]: I0930 13:55:43.026126 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 30 13:55:43 crc kubenswrapper[4763]: I0930 13:55:43.464855 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 13:55:43 crc kubenswrapper[4763]: I0930 13:55:43.465102 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f9249dc6-ff96-4198-b25e-7362067617ab" containerName="glance-log" containerID="cri-o://4d8542f41273e622d163dbe4ad7305ecf9a69c0a41ec43ac9d60f5cdcb8a0328" gracePeriod=30
Sep 30 13:55:43 crc kubenswrapper[4763]: I0930 13:55:43.465223 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f9249dc6-ff96-4198-b25e-7362067617ab" containerName="glance-httpd" containerID="cri-o://0a9f5f188ca7ae4e254395e48120dcd08ab2e6b2c8ba46ff6fd04b550ba23940" gracePeriod=30
Sep 30 13:55:43 crc kubenswrapper[4763]: I0930 13:55:43.511589 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 13:55:43 crc kubenswrapper[4763]: W0930 13:55:43.514877 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bb91013_85e0_4a13_9a06_0608b16a147b.slice/crio-26c6f327823006b666d992af446a7422c031f313b3bfcbafd3c3d69d64cfc3cd WatchSource:0}: Error finding container 26c6f327823006b666d992af446a7422c031f313b3bfcbafd3c3d69d64cfc3cd: Status 404 returned error can't find the container with id 26c6f327823006b666d992af446a7422c031f313b3bfcbafd3c3d69d64cfc3cd
Sep 30 13:55:43 crc kubenswrapper[4763]: I0930 13:55:43.675424 4763 generic.go:334] "Generic (PLEG): container finished" podID="f9249dc6-ff96-4198-b25e-7362067617ab" containerID="4d8542f41273e622d163dbe4ad7305ecf9a69c0a41ec43ac9d60f5cdcb8a0328" exitCode=143
Sep 30 13:55:43 crc kubenswrapper[4763]: I0930 13:55:43.675490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f9249dc6-ff96-4198-b25e-7362067617ab","Type":"ContainerDied","Data":"4d8542f41273e622d163dbe4ad7305ecf9a69c0a41ec43ac9d60f5cdcb8a0328"}
pod="openstack/glance-default-external-api-0" event={"ID":"f9249dc6-ff96-4198-b25e-7362067617ab","Type":"ContainerDied","Data":"4d8542f41273e622d163dbe4ad7305ecf9a69c0a41ec43ac9d60f5cdcb8a0328"} Sep 30 13:55:43 crc kubenswrapper[4763]: I0930 13:55:43.678407 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bb91013-85e0-4a13-9a06-0608b16a147b","Type":"ContainerStarted","Data":"26c6f327823006b666d992af446a7422c031f313b3bfcbafd3c3d69d64cfc3cd"} Sep 30 13:55:44 crc kubenswrapper[4763]: I0930 13:55:44.501402 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ddf1156-f78b-43ce-bca1-44c026df8262" path="/var/lib/kubelet/pods/4ddf1156-f78b-43ce-bca1-44c026df8262/volumes" Sep 30 13:55:44 crc kubenswrapper[4763]: I0930 13:55:44.722932 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bb91013-85e0-4a13-9a06-0608b16a147b","Type":"ContainerStarted","Data":"cba654c3201589fdfeff007899f0f73ab1e63e34fc4ccb4f54ba1464e5755d0a"} Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.731765 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bb91013-85e0-4a13-9a06-0608b16a147b","Type":"ContainerStarted","Data":"e1f5778fa17d7cddfefc9145f7fd206fb41d3fa0e3cf06f8bd8e12eb8a451d1b"} Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.741518 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.741495353 podStartE2EDuration="3.741495353s" podCreationTimestamp="2025-09-30 13:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:44.748853041 +0000 UTC m=+1216.887413326" watchObservedRunningTime="2025-09-30 13:55:45.741495353 +0000 UTC m=+1217.880055638" Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.745646 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-bbaf-account-create-dzmfz"] Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.749212 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-bbaf-account-create-dzmfz" Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.751556 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.756266 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bbaf-account-create-dzmfz"] Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.762733 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvsh6\" (UniqueName: \"kubernetes.io/projected/64ed31b0-29b4-49fb-9a0b-0c07ef07706c-kube-api-access-wvsh6\") pod \"nova-api-bbaf-account-create-dzmfz\" (UID: \"64ed31b0-29b4-49fb-9a0b-0c07ef07706c\") " pod="openstack/nova-api-bbaf-account-create-dzmfz" Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.864762 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvsh6\" (UniqueName: \"kubernetes.io/projected/64ed31b0-29b4-49fb-9a0b-0c07ef07706c-kube-api-access-wvsh6\") pod \"nova-api-bbaf-account-create-dzmfz\" (UID: \"64ed31b0-29b4-49fb-9a0b-0c07ef07706c\") " pod="openstack/nova-api-bbaf-account-create-dzmfz" Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.868304 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.921566 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvsh6\" (UniqueName: \"kubernetes.io/projected/64ed31b0-29b4-49fb-9a0b-0c07ef07706c-kube-api-access-wvsh6\") pod \"nova-api-bbaf-account-create-dzmfz\" (UID: \"64ed31b0-29b4-49fb-9a0b-0c07ef07706c\") " pod="openstack/nova-api-bbaf-account-create-dzmfz" Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.944272 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-903f-account-create-j2fds"] Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.945389 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-903f-account-create-j2fds" Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.950240 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.966497 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wnv2\" (UniqueName: \"kubernetes.io/projected/56d7990d-14a8-4872-ad27-85dc01c63f23-kube-api-access-7wnv2\") pod \"nova-cell0-903f-account-create-j2fds\" (UID: \"56d7990d-14a8-4872-ad27-85dc01c63f23\") " pod="openstack/nova-cell0-903f-account-create-j2fds" Sep 30 13:55:45 crc kubenswrapper[4763]: I0930 13:55:45.966863 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-903f-account-create-j2fds"] Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.066895 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-bbaf-account-create-dzmfz" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.068509 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wnv2\" (UniqueName: \"kubernetes.io/projected/56d7990d-14a8-4872-ad27-85dc01c63f23-kube-api-access-7wnv2\") pod \"nova-cell0-903f-account-create-j2fds\" (UID: \"56d7990d-14a8-4872-ad27-85dc01c63f23\") " pod="openstack/nova-cell0-903f-account-create-j2fds" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.091570 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wnv2\" (UniqueName: \"kubernetes.io/projected/56d7990d-14a8-4872-ad27-85dc01c63f23-kube-api-access-7wnv2\") pod \"nova-cell0-903f-account-create-j2fds\" (UID: \"56d7990d-14a8-4872-ad27-85dc01c63f23\") " pod="openstack/nova-cell0-903f-account-create-j2fds" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.144015 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4d85-account-create-92v9x"] Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.145183 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4d85-account-create-92v9x" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.147842 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.156398 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4d85-account-create-92v9x"] Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.177571 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sgdq\" (UniqueName: \"kubernetes.io/projected/7f4dee33-3e98-4741-9001-bb28578a2a24-kube-api-access-2sgdq\") pod \"nova-cell1-4d85-account-create-92v9x\" (UID: \"7f4dee33-3e98-4741-9001-bb28578a2a24\") " pod="openstack/nova-cell1-4d85-account-create-92v9x" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.268088 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-903f-account-create-j2fds" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.282883 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sgdq\" (UniqueName: \"kubernetes.io/projected/7f4dee33-3e98-4741-9001-bb28578a2a24-kube-api-access-2sgdq\") pod \"nova-cell1-4d85-account-create-92v9x\" (UID: \"7f4dee33-3e98-4741-9001-bb28578a2a24\") " pod="openstack/nova-cell1-4d85-account-create-92v9x" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.304174 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sgdq\" (UniqueName: \"kubernetes.io/projected/7f4dee33-3e98-4741-9001-bb28578a2a24-kube-api-access-2sgdq\") pod \"nova-cell1-4d85-account-create-92v9x\" (UID: \"7f4dee33-3e98-4741-9001-bb28578a2a24\") " pod="openstack/nova-cell1-4d85-account-create-92v9x" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.507422 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4d85-account-create-92v9x" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.643426 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bbaf-account-create-dzmfz"] Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.643558 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="f9249dc6-ff96-4198-b25e-7362067617ab" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9292/healthcheck\": read tcp 10.217.0.2:55266->10.217.0.160:9292: read: connection reset by peer" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.644312 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="f9249dc6-ff96-4198-b25e-7362067617ab" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.160:9292/healthcheck\": read tcp 10.217.0.2:55274->10.217.0.160:9292: read: connection reset by peer" Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.760844 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bbaf-account-create-dzmfz" event={"ID":"64ed31b0-29b4-49fb-9a0b-0c07ef07706c","Type":"ContainerStarted","Data":"6f96078611c8deca34587d9c7e0bd217a1551a5ef56170200b8a0dbe9ef7b2b0"} Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.771916 4763 generic.go:334] "Generic (PLEG): container finished" podID="f9249dc6-ff96-4198-b25e-7362067617ab" containerID="0a9f5f188ca7ae4e254395e48120dcd08ab2e6b2c8ba46ff6fd04b550ba23940" exitCode=0 Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.773306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f9249dc6-ff96-4198-b25e-7362067617ab","Type":"ContainerDied","Data":"0a9f5f188ca7ae4e254395e48120dcd08ab2e6b2c8ba46ff6fd04b550ba23940"} Sep 30 13:55:46 crc kubenswrapper[4763]: I0930 13:55:46.793673 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-903f-account-create-j2fds"] Sep 30 13:55:46 crc kubenswrapper[4763]: E0930 13:55:46.879418 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9249dc6_ff96_4198_b25e_7362067617ab.slice/crio-0a9f5f188ca7ae4e254395e48120dcd08ab2e6b2c8ba46ff6fd04b550ba23940.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9249dc6_ff96_4198_b25e_7362067617ab.slice/crio-conmon-0a9f5f188ca7ae4e254395e48120dcd08ab2e6b2c8ba46ff6fd04b550ba23940.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.002160 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4d85-account-create-92v9x"] Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.255200 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.314284 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-scripts\") pod \"f9249dc6-ff96-4198-b25e-7362067617ab\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.314481 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-public-tls-certs\") pod \"f9249dc6-ff96-4198-b25e-7362067617ab\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.314652 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-config-data\") pod \"f9249dc6-ff96-4198-b25e-7362067617ab\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.315418 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-combined-ca-bundle\") pod \"f9249dc6-ff96-4198-b25e-7362067617ab\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.315503 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"f9249dc6-ff96-4198-b25e-7362067617ab\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.315539 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-logs\") pod \"f9249dc6-ff96-4198-b25e-7362067617ab\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.315587 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-httpd-run\") pod \"f9249dc6-ff96-4198-b25e-7362067617ab\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.315641 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxv62\" (UniqueName: \"kubernetes.io/projected/f9249dc6-ff96-4198-b25e-7362067617ab-kube-api-access-vxv62\") pod \"f9249dc6-ff96-4198-b25e-7362067617ab\" (UID: \"f9249dc6-ff96-4198-b25e-7362067617ab\") " Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.316198 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-logs" (OuterVolumeSpecName: "logs") pod "f9249dc6-ff96-4198-b25e-7362067617ab" (UID: "f9249dc6-ff96-4198-b25e-7362067617ab"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.316325 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.317961 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f9249dc6-ff96-4198-b25e-7362067617ab" (UID: "f9249dc6-ff96-4198-b25e-7362067617ab"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.328144 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-scripts" (OuterVolumeSpecName: "scripts") pod "f9249dc6-ff96-4198-b25e-7362067617ab" (UID: "f9249dc6-ff96-4198-b25e-7362067617ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.329119 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "f9249dc6-ff96-4198-b25e-7362067617ab" (UID: "f9249dc6-ff96-4198-b25e-7362067617ab"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.332855 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9249dc6-ff96-4198-b25e-7362067617ab-kube-api-access-vxv62" (OuterVolumeSpecName: "kube-api-access-vxv62") pod "f9249dc6-ff96-4198-b25e-7362067617ab" (UID: "f9249dc6-ff96-4198-b25e-7362067617ab"). InnerVolumeSpecName "kube-api-access-vxv62". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.391630 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9249dc6-ff96-4198-b25e-7362067617ab" (UID: "f9249dc6-ff96-4198-b25e-7362067617ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.412229 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f9249dc6-ff96-4198-b25e-7362067617ab" (UID: "f9249dc6-ff96-4198-b25e-7362067617ab"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.423120 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.423148 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.423160 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.423181 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.423190 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9249dc6-ff96-4198-b25e-7362067617ab-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.423200 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxv62\" (UniqueName: \"kubernetes.io/projected/f9249dc6-ff96-4198-b25e-7362067617ab-kube-api-access-vxv62\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.426548 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-config-data" (OuterVolumeSpecName: "config-data") pod "f9249dc6-ff96-4198-b25e-7362067617ab" (UID: "f9249dc6-ff96-4198-b25e-7362067617ab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.442473 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.524669 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9249dc6-ff96-4198-b25e-7362067617ab-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.524705 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.781983 4763 generic.go:334] "Generic (PLEG): container finished" podID="64ed31b0-29b4-49fb-9a0b-0c07ef07706c" containerID="8227bad438fba7b4abeec80dc38b2de97381f23f132108eaf0c0183b892f50cb" exitCode=0 Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.782032 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bbaf-account-create-dzmfz" event={"ID":"64ed31b0-29b4-49fb-9a0b-0c07ef07706c","Type":"ContainerDied","Data":"8227bad438fba7b4abeec80dc38b2de97381f23f132108eaf0c0183b892f50cb"} Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.783693 4763 generic.go:334] "Generic (PLEG): container finished" podID="56d7990d-14a8-4872-ad27-85dc01c63f23" containerID="d294dc3212ae359102de89c32bffc910fda43a40e264c95e34fc8674cd634dfc" exitCode=0 Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.783805 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-903f-account-create-j2fds" event={"ID":"56d7990d-14a8-4872-ad27-85dc01c63f23","Type":"ContainerDied","Data":"d294dc3212ae359102de89c32bffc910fda43a40e264c95e34fc8674cd634dfc"} Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.784064 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-903f-account-create-j2fds" event={"ID":"56d7990d-14a8-4872-ad27-85dc01c63f23","Type":"ContainerStarted","Data":"8f33d957696aa8eca4f61f6b97ac2d787756d76fae6834c257f5109f5fa39304"} Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.785506 4763 generic.go:334] "Generic (PLEG): container finished" podID="7f4dee33-3e98-4741-9001-bb28578a2a24" containerID="91399e58f45ab5b2a692a69ce7f59195c5667685f1af70d93034455a54845843" exitCode=0 Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.785536 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4d85-account-create-92v9x" event={"ID":"7f4dee33-3e98-4741-9001-bb28578a2a24","Type":"ContainerDied","Data":"91399e58f45ab5b2a692a69ce7f59195c5667685f1af70d93034455a54845843"} Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.785566 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4d85-account-create-92v9x" event={"ID":"7f4dee33-3e98-4741-9001-bb28578a2a24","Type":"ContainerStarted","Data":"52ef8f40118a935b85e04f31eaf48d59536fa213177d49541e03fa985d9e1e9c"} Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.787515 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f9249dc6-ff96-4198-b25e-7362067617ab","Type":"ContainerDied","Data":"e584877617c206f005dc1e88ac507f72d7994b10804d6e1690868effe46703be"} Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.787554 4763 scope.go:117] 
"RemoveContainer" containerID="0a9f5f188ca7ae4e254395e48120dcd08ab2e6b2c8ba46ff6fd04b550ba23940" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.787572 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.828342 4763 scope.go:117] "RemoveContainer" containerID="4d8542f41273e622d163dbe4ad7305ecf9a69c0a41ec43ac9d60f5cdcb8a0328" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.855347 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.869049 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.889034 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:47 crc kubenswrapper[4763]: E0930 13:55:47.889455 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9249dc6-ff96-4198-b25e-7362067617ab" containerName="glance-log" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.889473 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9249dc6-ff96-4198-b25e-7362067617ab" containerName="glance-log" Sep 30 13:55:47 crc kubenswrapper[4763]: E0930 13:55:47.889500 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9249dc6-ff96-4198-b25e-7362067617ab" containerName="glance-httpd" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.889508 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9249dc6-ff96-4198-b25e-7362067617ab" containerName="glance-httpd" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.889698 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9249dc6-ff96-4198-b25e-7362067617ab" containerName="glance-httpd" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.889723 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9249dc6-ff96-4198-b25e-7362067617ab" containerName="glance-log" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.907580 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.907901 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="ceilometer-central-agent" containerID="cri-o://305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4" gracePeriod=30 Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.908073 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.908832 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="sg-core" containerID="cri-o://3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31" gracePeriod=30 Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.908947 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="ceilometer-notification-agent" containerID="cri-o://555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944" gracePeriod=30 Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.909146 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="proxy-httpd" containerID="cri-o://2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28" gracePeriod=30 Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.913622 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.914923 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.916905 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.933809 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.933856 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.976633 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:47 crc kubenswrapper[4763]: I0930 13:55:47.989061 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.026341 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.035013 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-logs\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.035084 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.035140 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xshrq\" (UniqueName: 
\"kubernetes.io/projected/ba159e27-7a3b-4b90-a7db-de6135f8153c-kube-api-access-xshrq\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.035202 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.035220 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.035238 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.035316 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.035347 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.136552 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.136618 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.136653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-logs\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.136719 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.136794 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xshrq\" (UniqueName: \"kubernetes.io/projected/ba159e27-7a3b-4b90-a7db-de6135f8153c-kube-api-access-xshrq\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.136851 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.136872 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.136893 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.137412 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-logs\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.138714 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.138794 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.142714 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.159360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.159509 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.159948 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.162649 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xshrq\" (UniqueName: \"kubernetes.io/projected/ba159e27-7a3b-4b90-a7db-de6135f8153c-kube-api-access-xshrq\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.177482 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.280716 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.559912 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9249dc6-ff96-4198-b25e-7362067617ab" path="/var/lib/kubelet/pods/f9249dc6-ff96-4198-b25e-7362067617ab/volumes" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.761019 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.811959 4763 generic.go:334] "Generic (PLEG): container finished" podID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerID="2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28" exitCode=0 Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.811988 4763 generic.go:334] "Generic (PLEG): container finished" podID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerID="3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31" exitCode=2 Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.811997 4763 generic.go:334] "Generic (PLEG): container finished" podID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerID="555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944" exitCode=0 Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.812004 4763 generic.go:334] "Generic (PLEG): container finished" podID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerID="305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4" exitCode=0 Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.812203 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.813120 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"087a51db-aab7-45f7-986b-dbbe1c1edd39","Type":"ContainerDied","Data":"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28"} Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.813175 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"087a51db-aab7-45f7-986b-dbbe1c1edd39","Type":"ContainerDied","Data":"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31"} Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.813188 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"087a51db-aab7-45f7-986b-dbbe1c1edd39","Type":"ContainerDied","Data":"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944"} Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.813197 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"087a51db-aab7-45f7-986b-dbbe1c1edd39","Type":"ContainerDied","Data":"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4"} Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.813205 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"087a51db-aab7-45f7-986b-dbbe1c1edd39","Type":"ContainerDied","Data":"6d915f9a09593d925eb52131cf39fc7d4d3b0bee2e0ec4debfffb6b9698e5896"} Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.813220 4763 scope.go:117] "RemoveContainer" containerID="2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.814859 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.814888 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.862871 4763 scope.go:117] "RemoveContainer" containerID="3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.864062 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-scripts\") pod \"087a51db-aab7-45f7-986b-dbbe1c1edd39\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.864264 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-log-httpd\") pod \"087a51db-aab7-45f7-986b-dbbe1c1edd39\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.864293 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xxv2\" (UniqueName: \"kubernetes.io/projected/087a51db-aab7-45f7-986b-dbbe1c1edd39-kube-api-access-2xxv2\") pod \"087a51db-aab7-45f7-986b-dbbe1c1edd39\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.864336 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-sg-core-conf-yaml\") pod \"087a51db-aab7-45f7-986b-dbbe1c1edd39\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.864390 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-combined-ca-bundle\") pod \"087a51db-aab7-45f7-986b-dbbe1c1edd39\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.864410 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-run-httpd\") pod \"087a51db-aab7-45f7-986b-dbbe1c1edd39\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.864474 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-config-data\") pod \"087a51db-aab7-45f7-986b-dbbe1c1edd39\" (UID: \"087a51db-aab7-45f7-986b-dbbe1c1edd39\") " Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.867411 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "087a51db-aab7-45f7-986b-dbbe1c1edd39" (UID: "087a51db-aab7-45f7-986b-dbbe1c1edd39"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.870548 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "087a51db-aab7-45f7-986b-dbbe1c1edd39" (UID: "087a51db-aab7-45f7-986b-dbbe1c1edd39"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.871821 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087a51db-aab7-45f7-986b-dbbe1c1edd39-kube-api-access-2xxv2" (OuterVolumeSpecName: "kube-api-access-2xxv2") pod "087a51db-aab7-45f7-986b-dbbe1c1edd39" (UID: "087a51db-aab7-45f7-986b-dbbe1c1edd39"). InnerVolumeSpecName "kube-api-access-2xxv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.872138 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-scripts" (OuterVolumeSpecName: "scripts") pod "087a51db-aab7-45f7-986b-dbbe1c1edd39" (UID: "087a51db-aab7-45f7-986b-dbbe1c1edd39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.901816 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "087a51db-aab7-45f7-986b-dbbe1c1edd39" (UID: "087a51db-aab7-45f7-986b-dbbe1c1edd39"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.914761 4763 scope.go:117] "RemoveContainer" containerID="555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.967538 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xxv2\" (UniqueName: \"kubernetes.io/projected/087a51db-aab7-45f7-986b-dbbe1c1edd39-kube-api-access-2xxv2\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.967583 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.967613 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.967629 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:48 crc kubenswrapper[4763]: I0930 13:55:48.967641 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/087a51db-aab7-45f7-986b-dbbe1c1edd39-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.006302 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:55:49 crc kubenswrapper[4763]: W0930 13:55:49.038402 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba159e27_7a3b_4b90_a7db_de6135f8153c.slice/crio-5e9262290f06c0b049b8c4a763dd2786af46c0daf94e2c034bd9a8071f39cfc3 WatchSource:0}: Error finding container 5e9262290f06c0b049b8c4a763dd2786af46c0daf94e2c034bd9a8071f39cfc3: Status 404 returned error can't find the container with id 5e9262290f06c0b049b8c4a763dd2786af46c0daf94e2c034bd9a8071f39cfc3 Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.044701 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "087a51db-aab7-45f7-986b-dbbe1c1edd39" (UID: "087a51db-aab7-45f7-986b-dbbe1c1edd39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.057789 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-config-data" (OuterVolumeSpecName: "config-data") pod "087a51db-aab7-45f7-986b-dbbe1c1edd39" (UID: "087a51db-aab7-45f7-986b-dbbe1c1edd39"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.069066 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.069124 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087a51db-aab7-45f7-986b-dbbe1c1edd39-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.162409 4763 scope.go:117] "RemoveContainer" containerID="305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.168080 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.183388 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.219944 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:49 crc kubenswrapper[4763]: E0930 13:55:49.220404 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="ceilometer-notification-agent" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.220429 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="ceilometer-notification-agent" Sep 30 13:55:49 crc kubenswrapper[4763]: E0930 13:55:49.220454 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="proxy-httpd" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.220464 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="proxy-httpd" Sep 30 13:55:49 crc kubenswrapper[4763]: E0930 13:55:49.220492 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="ceilometer-central-agent" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.220499 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="ceilometer-central-agent" Sep 30 13:55:49 crc kubenswrapper[4763]: E0930 13:55:49.220518 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="sg-core" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.220526 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="sg-core" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.220743 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="ceilometer-central-agent" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.220757 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="proxy-httpd" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.220779 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="sg-core" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.220793 4763 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" containerName="ceilometer-notification-agent" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.223363 4763 scope.go:117] "RemoveContainer" containerID="2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.226846 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: E0930 13:55:49.228773 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28\": container with ID starting with 2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28 not found: ID does not exist" containerID="2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.228816 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28"} err="failed to get container status \"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28\": rpc error: code = NotFound desc = could not find container \"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28\": container with ID starting with 2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.228843 4763 scope.go:117] "RemoveContainer" containerID="3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31" Sep 30 13:55:49 crc kubenswrapper[4763]: E0930 13:55:49.230159 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31\": container with ID starting with 3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31 not found: ID does not exist" containerID="3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.230190 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31"} err="failed to get container status \"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31\": rpc error: code = NotFound desc = could not find container \"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31\": container with ID starting with 3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.230209 4763 scope.go:117] "RemoveContainer" containerID="555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.230487 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.230568 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:55:49 crc kubenswrapper[4763]: E0930 13:55:49.230891 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944\": container with ID starting with 
555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944 not found: ID does not exist" containerID="555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.230928 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944"} err="failed to get container status \"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944\": rpc error: code = NotFound desc = could not find container \"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944\": container with ID starting with 555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.230952 4763 scope.go:117] "RemoveContainer" containerID="305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.232006 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:49 crc kubenswrapper[4763]: E0930 13:55:49.232509 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4\": container with ID starting with 305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4 not found: ID does not exist" containerID="305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.232540 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4"} err="failed to get container status \"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4\": rpc error: code = NotFound desc = could not find container \"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4\": container with ID starting with 305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.232558 4763 scope.go:117] "RemoveContainer" containerID="2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.233212 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28"} err="failed to get container status \"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28\": rpc error: code = NotFound desc = could not find container \"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28\": container with ID starting with 2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.233236 4763 scope.go:117] "RemoveContainer" containerID="3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.235577 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31"} err="failed to get container status \"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31\": rpc error: code = NotFound desc = could not find container 
\"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31\": container with ID starting with 3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.235630 4763 scope.go:117] "RemoveContainer" containerID="555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.236795 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944"} err="failed to get container status \"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944\": rpc error: code = NotFound desc = could not find container \"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944\": container with ID starting with 555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.236825 4763 scope.go:117] "RemoveContainer" containerID="305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.237569 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4"} err="failed to get container status \"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4\": rpc error: code = NotFound desc = could not find container \"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4\": container with ID starting with 305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.237711 4763 scope.go:117] "RemoveContainer" containerID="2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.238105 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28"} err="failed to get container status \"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28\": rpc error: code = NotFound desc = could not find container \"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28\": container with ID starting with 2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.238129 4763 scope.go:117] "RemoveContainer" containerID="3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.238553 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31"} err="failed to get container status \"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31\": rpc error: code = NotFound desc = could not find container \"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31\": container with ID starting with 3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.238581 4763 scope.go:117] "RemoveContainer" containerID="555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.239098 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944"} err="failed to get container status \"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944\": rpc error: code = NotFound desc = could not find container \"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944\": container with ID starting with 555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.239126 4763 scope.go:117] "RemoveContainer" containerID="305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.239672 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4"} err="failed to get container status \"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4\": rpc error: code = NotFound desc = could not find container \"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4\": container with ID starting with 305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.239697 4763 scope.go:117] "RemoveContainer" containerID="2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.240561 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28"} err="failed to get container status \"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28\": rpc error: code = NotFound desc = could not find container \"2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28\": container with ID starting with 2d3d1816a7c0e7dc3e2e726156d79fefc4472e509d0d6701b6c9efbb505c4d28 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.240583 4763 scope.go:117] "RemoveContainer" containerID="3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.242056 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31"} err="failed to get container status \"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31\": rpc error: code = NotFound desc = could not find container \"3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31\": container with ID starting with 3b9a62de4e115e9c43d20b091cd90e76b1b9a5e17ec95013f72f61d072733b31 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.242095 4763 scope.go:117] "RemoveContainer" containerID="555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.261268 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944"} err="failed to get container status \"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944\": rpc error: code = NotFound desc = could not find container \"555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944\": container with ID starting with 
555f46419e1c41817586b0c462166e95a3db675d6cb852c650068accfa496944 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.261334 4763 scope.go:117] "RemoveContainer" containerID="305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.263216 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4"} err="failed to get container status \"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4\": rpc error: code = NotFound desc = could not find container \"305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4\": container with ID starting with 305271e414342b926090bb004b9df69aabb68216d4692192f0639bc36bf0eea4 not found: ID does not exist" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.277078 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fcw4\" (UniqueName: \"kubernetes.io/projected/534a331d-5889-41bf-b2ac-a32e9d7886e6-kube-api-access-5fcw4\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.277121 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.277153 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-config-data\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.277178 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-scripts\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.277193 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-log-httpd\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.277219 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.277262 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-run-httpd\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.278181 4763 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bbaf-account-create-dzmfz" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.378851 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fcw4\" (UniqueName: \"kubernetes.io/projected/534a331d-5889-41bf-b2ac-a32e9d7886e6-kube-api-access-5fcw4\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.378914 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.378958 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-config-data\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.378987 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-scripts\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.379006 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-log-httpd\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.379037 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.379096 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-run-httpd\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.379662 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-run-httpd\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.380685 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-log-httpd\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.389945 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.390263 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-config-data\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.394989 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-scripts\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.396379 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.407223 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fcw4\" (UniqueName: \"kubernetes.io/projected/534a331d-5889-41bf-b2ac-a32e9d7886e6-kube-api-access-5fcw4\") pod \"ceilometer-0\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.431702 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4d85-account-create-92v9x" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.451148 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-903f-account-create-j2fds" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.483815 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wnv2\" (UniqueName: \"kubernetes.io/projected/56d7990d-14a8-4872-ad27-85dc01c63f23-kube-api-access-7wnv2\") pod \"56d7990d-14a8-4872-ad27-85dc01c63f23\" (UID: \"56d7990d-14a8-4872-ad27-85dc01c63f23\") " Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.483943 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sgdq\" (UniqueName: \"kubernetes.io/projected/7f4dee33-3e98-4741-9001-bb28578a2a24-kube-api-access-2sgdq\") pod \"7f4dee33-3e98-4741-9001-bb28578a2a24\" (UID: \"7f4dee33-3e98-4741-9001-bb28578a2a24\") " Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.484486 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvsh6\" (UniqueName: \"kubernetes.io/projected/64ed31b0-29b4-49fb-9a0b-0c07ef07706c-kube-api-access-wvsh6\") pod \"64ed31b0-29b4-49fb-9a0b-0c07ef07706c\" (UID: \"64ed31b0-29b4-49fb-9a0b-0c07ef07706c\") " Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.487064 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d7990d-14a8-4872-ad27-85dc01c63f23-kube-api-access-7wnv2" (OuterVolumeSpecName: "kube-api-access-7wnv2") pod "56d7990d-14a8-4872-ad27-85dc01c63f23" (UID: "56d7990d-14a8-4872-ad27-85dc01c63f23"). InnerVolumeSpecName "kube-api-access-7wnv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.487905 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ed31b0-29b4-49fb-9a0b-0c07ef07706c-kube-api-access-wvsh6" (OuterVolumeSpecName: "kube-api-access-wvsh6") pod "64ed31b0-29b4-49fb-9a0b-0c07ef07706c" (UID: "64ed31b0-29b4-49fb-9a0b-0c07ef07706c"). InnerVolumeSpecName "kube-api-access-wvsh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.489806 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f4dee33-3e98-4741-9001-bb28578a2a24-kube-api-access-2sgdq" (OuterVolumeSpecName: "kube-api-access-2sgdq") pod "7f4dee33-3e98-4741-9001-bb28578a2a24" (UID: "7f4dee33-3e98-4741-9001-bb28578a2a24"). InnerVolumeSpecName "kube-api-access-2sgdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.560374 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.586840 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wnv2\" (UniqueName: \"kubernetes.io/projected/56d7990d-14a8-4872-ad27-85dc01c63f23-kube-api-access-7wnv2\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.586869 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sgdq\" (UniqueName: \"kubernetes.io/projected/7f4dee33-3e98-4741-9001-bb28578a2a24-kube-api-access-2sgdq\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.586878 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvsh6\" (UniqueName: \"kubernetes.io/projected/64ed31b0-29b4-49fb-9a0b-0c07ef07706c-kube-api-access-wvsh6\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.828838 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba159e27-7a3b-4b90-a7db-de6135f8153c","Type":"ContainerStarted","Data":"5e9262290f06c0b049b8c4a763dd2786af46c0daf94e2c034bd9a8071f39cfc3"} Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.839046 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4d85-account-create-92v9x" event={"ID":"7f4dee33-3e98-4741-9001-bb28578a2a24","Type":"ContainerDied","Data":"52ef8f40118a935b85e04f31eaf48d59536fa213177d49541e03fa985d9e1e9c"} Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.839329 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52ef8f40118a935b85e04f31eaf48d59536fa213177d49541e03fa985d9e1e9c" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.839417 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4d85-account-create-92v9x" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.843772 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-bbaf-account-create-dzmfz" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.844734 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bbaf-account-create-dzmfz" event={"ID":"64ed31b0-29b4-49fb-9a0b-0c07ef07706c","Type":"ContainerDied","Data":"6f96078611c8deca34587d9c7e0bd217a1551a5ef56170200b8a0dbe9ef7b2b0"} Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.844781 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f96078611c8deca34587d9c7e0bd217a1551a5ef56170200b8a0dbe9ef7b2b0" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.846528 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-903f-account-create-j2fds" Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.846644 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-903f-account-create-j2fds" event={"ID":"56d7990d-14a8-4872-ad27-85dc01c63f23","Type":"ContainerDied","Data":"8f33d957696aa8eca4f61f6b97ac2d787756d76fae6834c257f5109f5fa39304"} Sep 30 13:55:49 crc kubenswrapper[4763]: I0930 13:55:49.846674 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f33d957696aa8eca4f61f6b97ac2d787756d76fae6834c257f5109f5fa39304" Sep 30 13:55:50 crc kubenswrapper[4763]: I0930 13:55:50.036834 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:50 crc kubenswrapper[4763]: I0930 13:55:50.238210 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:55:50 crc kubenswrapper[4763]: I0930 13:55:50.504212 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="087a51db-aab7-45f7-986b-dbbe1c1edd39" path="/var/lib/kubelet/pods/087a51db-aab7-45f7-986b-dbbe1c1edd39/volumes" Sep 30 13:55:50 crc kubenswrapper[4763]: I0930 13:55:50.864144 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"534a331d-5889-41bf-b2ac-a32e9d7886e6","Type":"ContainerStarted","Data":"34cc5f8645eb6bb3b1baf9d3c26a8bdc58553638730002736f46ea5a7e163876"} Sep 30 13:55:50 crc kubenswrapper[4763]: I0930 13:55:50.864469 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"534a331d-5889-41bf-b2ac-a32e9d7886e6","Type":"ContainerStarted","Data":"987da1f8594b193315ff887a90e5544b35313f4c276383b9b980c46319ec82c2"} Sep 30 13:55:50 crc kubenswrapper[4763]: I0930 13:55:50.869296 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba159e27-7a3b-4b90-a7db-de6135f8153c","Type":"ContainerStarted","Data":"1b8f0928a35a4ae56d6ef0cb85281920dd3c4a313f493d020961d40d139fa47b"} Sep 30 13:55:50 crc kubenswrapper[4763]: I0930 13:55:50.869345 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba159e27-7a3b-4b90-a7db-de6135f8153c","Type":"ContainerStarted","Data":"cb40ecb42c9c7e13873d46e9437c88735cec11180b4edf02393f88c403b8189b"} Sep 30 13:55:50 crc kubenswrapper[4763]: I0930 13:55:50.869348 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:55:50 crc kubenswrapper[4763]: I0930 13:55:50.869368 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.025317 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.028738 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.049624 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.049590617 podStartE2EDuration="4.049590617s" podCreationTimestamp="2025-09-30 13:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:50.915268366 +0000 UTC m=+1223.053828661" watchObservedRunningTime="2025-09-30 13:55:51.049590617 +0000 UTC m=+1223.188150902" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.237106 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mgpxr"] Sep 30 13:55:51 crc kubenswrapper[4763]: E0930 13:55:51.237883 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4dee33-3e98-4741-9001-bb28578a2a24" containerName="mariadb-account-create" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.237903 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4dee33-3e98-4741-9001-bb28578a2a24" containerName="mariadb-account-create" Sep 30 13:55:51 crc kubenswrapper[4763]: E0930 13:55:51.237921 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d7990d-14a8-4872-ad27-85dc01c63f23" containerName="mariadb-account-create" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.237930 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d7990d-14a8-4872-ad27-85dc01c63f23" containerName="mariadb-account-create" Sep 30 13:55:51 crc kubenswrapper[4763]: E0930 13:55:51.237940 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ed31b0-29b4-49fb-9a0b-0c07ef07706c" containerName="mariadb-account-create" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.237948 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ed31b0-29b4-49fb-9a0b-0c07ef07706c" containerName="mariadb-account-create" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.238170 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4dee33-3e98-4741-9001-bb28578a2a24" containerName="mariadb-account-create" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.238193 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ed31b0-29b4-49fb-9a0b-0c07ef07706c" containerName="mariadb-account-create" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.238206 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d7990d-14a8-4872-ad27-85dc01c63f23" containerName="mariadb-account-create" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.238924 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.243041 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.243976 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nfrc5" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.243988 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.246409 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mgpxr"] Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.325757 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-scripts\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.325800 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.325831 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-config-data\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.325923 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pj4z\" (UniqueName: \"kubernetes.io/projected/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-kube-api-access-7pj4z\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.427653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pj4z\" (UniqueName: \"kubernetes.io/projected/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-kube-api-access-7pj4z\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.427765 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-scripts\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.427798 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: 
\"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.427841 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-config-data\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.431222 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.431590 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-scripts\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.435109 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-config-data\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.450205 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pj4z\" (UniqueName: \"kubernetes.io/projected/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-kube-api-access-7pj4z\") pod \"nova-cell0-conductor-db-sync-mgpxr\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") " pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.558318 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mgpxr" Sep 30 13:55:51 crc kubenswrapper[4763]: I0930 13:55:51.879701 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"534a331d-5889-41bf-b2ac-a32e9d7886e6","Type":"ContainerStarted","Data":"5f33a5495a465f62d19c03e012154de2759261c741e47dac4d505ae9c89beab0"} Sep 30 13:55:52 crc kubenswrapper[4763]: I0930 13:55:52.046189 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mgpxr"] Sep 30 13:55:52 crc kubenswrapper[4763]: W0930 13:55:52.048273 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3e417ea_91e0_4cb5_baf8_d5c6bd758ae7.slice/crio-c27456c198fc8cb3fd14ceb87d736218d9efd3e280a89cbc9631b492e3a1ea4f WatchSource:0}: Error finding container c27456c198fc8cb3fd14ceb87d736218d9efd3e280a89cbc9631b492e3a1ea4f: Status 404 returned error can't find the container with id c27456c198fc8cb3fd14ceb87d736218d9efd3e280a89cbc9631b492e3a1ea4f Sep 30 13:55:52 crc kubenswrapper[4763]: I0930 13:55:52.899782 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"534a331d-5889-41bf-b2ac-a32e9d7886e6","Type":"ContainerStarted","Data":"754dda910f1c0e1b11ffffc812452889aa0064b209f6d17f6a3da004802ae0c0"} Sep 30 13:55:52 crc kubenswrapper[4763]: I0930 13:55:52.901684 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mgpxr" event={"ID":"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7","Type":"ContainerStarted","Data":"c27456c198fc8cb3fd14ceb87d736218d9efd3e280a89cbc9631b492e3a1ea4f"} Sep 30 13:55:53 crc kubenswrapper[4763]: I0930 13:55:53.276137 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 13:55:53 crc kubenswrapper[4763]: I0930 13:55:53.914682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"534a331d-5889-41bf-b2ac-a32e9d7886e6","Type":"ContainerStarted","Data":"abab948542319c56ee3d94cc430c87c9f040d1a38a7ef3d4e90c9768a2581296"} Sep 30 13:55:53 crc kubenswrapper[4763]: I0930 13:55:53.914918 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="proxy-httpd" containerID="cri-o://abab948542319c56ee3d94cc430c87c9f040d1a38a7ef3d4e90c9768a2581296" gracePeriod=30 Sep 30 13:55:53 crc kubenswrapper[4763]: I0930 13:55:53.914942 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="ceilometer-notification-agent" containerID="cri-o://5f33a5495a465f62d19c03e012154de2759261c741e47dac4d505ae9c89beab0" gracePeriod=30 Sep 30 13:55:53 crc kubenswrapper[4763]: I0930 13:55:53.914936 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="sg-core" containerID="cri-o://754dda910f1c0e1b11ffffc812452889aa0064b209f6d17f6a3da004802ae0c0" gracePeriod=30 Sep 30 13:55:53 crc kubenswrapper[4763]: I0930 13:55:53.915170 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 13:55:53 crc kubenswrapper[4763]: I0930 13:55:53.916236 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="ceilometer-central-agent" containerID="cri-o://34cc5f8645eb6bb3b1baf9d3c26a8bdc58553638730002736f46ea5a7e163876" gracePeriod=30 Sep 30 13:55:53 crc kubenswrapper[4763]: I0930 13:55:53.948997 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6277064399999999 podStartE2EDuration="4.948977002s" podCreationTimestamp="2025-09-30 13:55:49 +0000 UTC" firstStartedPulling="2025-09-30 13:55:50.072052665 +0000 UTC m=+1222.210612950" lastFinishedPulling="2025-09-30 13:55:53.393323237 +0000 UTC m=+1225.531883512" observedRunningTime="2025-09-30 13:55:53.936973772 +0000 UTC m=+1226.075534057" watchObservedRunningTime="2025-09-30 13:55:53.948977002 +0000 UTC m=+1226.087537287" Sep 30 13:55:54 crc kubenswrapper[4763]: I0930 13:55:54.924192 4763 generic.go:334] "Generic (PLEG): container finished" podID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerID="abab948542319c56ee3d94cc430c87c9f040d1a38a7ef3d4e90c9768a2581296" exitCode=0 Sep 30 13:55:54 crc kubenswrapper[4763]: I0930 13:55:54.924431 4763 generic.go:334] "Generic (PLEG): container finished" podID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerID="754dda910f1c0e1b11ffffc812452889aa0064b209f6d17f6a3da004802ae0c0" exitCode=2 Sep 30 13:55:54 crc kubenswrapper[4763]: I0930 13:55:54.924441 4763 generic.go:334] "Generic (PLEG): container finished" podID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerID="5f33a5495a465f62d19c03e012154de2759261c741e47dac4d505ae9c89beab0" exitCode=0 Sep 30 13:55:54 crc kubenswrapper[4763]: I0930 13:55:54.924256 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"534a331d-5889-41bf-b2ac-a32e9d7886e6","Type":"ContainerDied","Data":"abab948542319c56ee3d94cc430c87c9f040d1a38a7ef3d4e90c9768a2581296"} Sep 30 13:55:54 crc kubenswrapper[4763]: I0930 13:55:54.924473 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"534a331d-5889-41bf-b2ac-a32e9d7886e6","Type":"ContainerDied","Data":"754dda910f1c0e1b11ffffc812452889aa0064b209f6d17f6a3da004802ae0c0"} Sep 30 13:55:54 crc kubenswrapper[4763]: I0930 13:55:54.924486 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"534a331d-5889-41bf-b2ac-a32e9d7886e6","Type":"ContainerDied","Data":"5f33a5495a465f62d19c03e012154de2759261c741e47dac4d505ae9c89beab0"} Sep 30 13:55:56 crc kubenswrapper[4763]: I0930 13:55:56.764095 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-576df9b9d8-5btc5" Sep 30 13:55:58 crc kubenswrapper[4763]: I0930 13:55:58.281131 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 13:55:58 crc kubenswrapper[4763]: I0930 13:55:58.281636 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 13:55:58 crc kubenswrapper[4763]: I0930 13:55:58.318506 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 13:55:58 crc kubenswrapper[4763]: I0930 13:55:58.322466 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 13:55:58 crc kubenswrapper[4763]: I0930 13:55:58.967579 4763 generic.go:334] "Generic (PLEG): container finished" podID="534a331d-5889-41bf-b2ac-a32e9d7886e6" 
containerID="34cc5f8645eb6bb3b1baf9d3c26a8bdc58553638730002736f46ea5a7e163876" exitCode=0 Sep 30 13:55:58 crc kubenswrapper[4763]: I0930 13:55:58.967623 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"534a331d-5889-41bf-b2ac-a32e9d7886e6","Type":"ContainerDied","Data":"34cc5f8645eb6bb3b1baf9d3c26a8bdc58553638730002736f46ea5a7e163876"} Sep 30 13:55:58 crc kubenswrapper[4763]: I0930 13:55:58.968237 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 13:55:58 crc kubenswrapper[4763]: I0930 13:55:58.968268 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.054342 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.195029 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-run-httpd\") pod \"534a331d-5889-41bf-b2ac-a32e9d7886e6\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.195376 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fcw4\" (UniqueName: \"kubernetes.io/projected/534a331d-5889-41bf-b2ac-a32e9d7886e6-kube-api-access-5fcw4\") pod \"534a331d-5889-41bf-b2ac-a32e9d7886e6\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.195403 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-config-data\") pod \"534a331d-5889-41bf-b2ac-a32e9d7886e6\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.195468 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-scripts\") pod \"534a331d-5889-41bf-b2ac-a32e9d7886e6\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.195623 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-log-httpd\") pod \"534a331d-5889-41bf-b2ac-a32e9d7886e6\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.195686 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-combined-ca-bundle\") pod \"534a331d-5889-41bf-b2ac-a32e9d7886e6\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.195739 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-sg-core-conf-yaml\") pod \"534a331d-5889-41bf-b2ac-a32e9d7886e6\" (UID: \"534a331d-5889-41bf-b2ac-a32e9d7886e6\") " Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.196051 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "534a331d-5889-41bf-b2ac-a32e9d7886e6" (UID: "534a331d-5889-41bf-b2ac-a32e9d7886e6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.196224 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.196646 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "534a331d-5889-41bf-b2ac-a32e9d7886e6" (UID: "534a331d-5889-41bf-b2ac-a32e9d7886e6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.201247 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534a331d-5889-41bf-b2ac-a32e9d7886e6-kube-api-access-5fcw4" (OuterVolumeSpecName: "kube-api-access-5fcw4") pod "534a331d-5889-41bf-b2ac-a32e9d7886e6" (UID: "534a331d-5889-41bf-b2ac-a32e9d7886e6"). InnerVolumeSpecName "kube-api-access-5fcw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.201779 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-scripts" (OuterVolumeSpecName: "scripts") pod "534a331d-5889-41bf-b2ac-a32e9d7886e6" (UID: "534a331d-5889-41bf-b2ac-a32e9d7886e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.222965 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "534a331d-5889-41bf-b2ac-a32e9d7886e6" (UID: "534a331d-5889-41bf-b2ac-a32e9d7886e6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.277314 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "534a331d-5889-41bf-b2ac-a32e9d7886e6" (UID: "534a331d-5889-41bf-b2ac-a32e9d7886e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.287905 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-config-data" (OuterVolumeSpecName: "config-data") pod "534a331d-5889-41bf-b2ac-a32e9d7886e6" (UID: "534a331d-5889-41bf-b2ac-a32e9d7886e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.297547 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fcw4\" (UniqueName: \"kubernetes.io/projected/534a331d-5889-41bf-b2ac-a32e9d7886e6-kube-api-access-5fcw4\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.297693 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.297707 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.297717 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/534a331d-5889-41bf-b2ac-a32e9d7886e6-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.297731 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.297742 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/534a331d-5889-41bf-b2ac-a32e9d7886e6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.982565 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"534a331d-5889-41bf-b2ac-a32e9d7886e6","Type":"ContainerDied","Data":"987da1f8594b193315ff887a90e5544b35313f4c276383b9b980c46319ec82c2"} Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.983091 4763 scope.go:117] "RemoveContainer" containerID="abab948542319c56ee3d94cc430c87c9f040d1a38a7ef3d4e90c9768a2581296" Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.982581 4763 util.go:48] "No ready sandbox for pod can be found. 
Sep 30 13:55:59 crc kubenswrapper[4763]: I0930 13:55:59.985447 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mgpxr" event={"ID":"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7","Type":"ContainerStarted","Data":"9f293e39c482f59dc3fed18cd673e0ab7b3a1b249fcc80c88a3973a815f36bca"}
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.016085 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mgpxr" podStartSLOduration=2.296814075 podStartE2EDuration="9.016064125s" podCreationTimestamp="2025-09-30 13:55:51 +0000 UTC" firstStartedPulling="2025-09-30 13:55:52.050303962 +0000 UTC m=+1224.188864247" lastFinishedPulling="2025-09-30 13:55:58.769554012 +0000 UTC m=+1230.908114297" observedRunningTime="2025-09-30 13:56:00.00788131 +0000 UTC m=+1232.146441605" watchObservedRunningTime="2025-09-30 13:56:00.016064125 +0000 UTC m=+1232.154624410"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.041889 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.042251 4763 scope.go:117] "RemoveContainer" containerID="754dda910f1c0e1b11ffffc812452889aa0064b209f6d17f6a3da004802ae0c0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.060339 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.071271 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 30 13:56:00 crc kubenswrapper[4763]: E0930 13:56:00.071741 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="ceilometer-notification-agent"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.071766 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="ceilometer-notification-agent"
Sep 30 13:56:00 crc kubenswrapper[4763]: E0930 13:56:00.071799 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="ceilometer-central-agent"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.071808 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="ceilometer-central-agent"
Sep 30 13:56:00 crc kubenswrapper[4763]: E0930 13:56:00.071838 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="proxy-httpd"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.071846 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="proxy-httpd"
Sep 30 13:56:00 crc kubenswrapper[4763]: E0930 13:56:00.071864 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="sg-core"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.071871 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="sg-core"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.072099 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="ceilometer-central-agent"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.072120 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="sg-core"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.072150 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="ceilometer-notification-agent"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.072167 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" containerName="proxy-httpd"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.074205 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.075646 4763 scope.go:117] "RemoveContainer" containerID="5f33a5495a465f62d19c03e012154de2759261c741e47dac4d505ae9c89beab0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.079668 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.079869 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.081828 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.125941 4763 scope.go:117] "RemoveContainer" containerID="34cc5f8645eb6bb3b1baf9d3c26a8bdc58553638730002736f46ea5a7e163876"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.222934 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.223340 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-run-httpd\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.223511 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-scripts\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.223623 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-log-httpd\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.223731 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbcxd\" (UniqueName: \"kubernetes.io/projected/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-kube-api-access-vbcxd\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.223823 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.223934 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-config-data\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.325972 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-scripts\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.326568 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-log-httpd\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.326692 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbcxd\" (UniqueName: \"kubernetes.io/projected/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-kube-api-access-vbcxd\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.326777 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.326889 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-config-data\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.327001 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.327157 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-run-httpd\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.327150 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-log-httpd\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.327707 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-run-httpd\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.332500 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.339724 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-config-data\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.340202 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.340726 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-scripts\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.349015 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbcxd\" (UniqueName: \"kubernetes.io/projected/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-kube-api-access-vbcxd\") pod \"ceilometer-0\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.401662 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.511087 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="534a331d-5889-41bf-b2ac-a32e9d7886e6" path="/var/lib/kubelet/pods/534a331d-5889-41bf-b2ac-a32e9d7886e6/volumes"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.824349 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.911378 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.957872 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 13:56:00 crc kubenswrapper[4763]: I0930 13:56:00.996192 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc","Type":"ContainerStarted","Data":"e6994d25a1dac58897ef4e07d7cebfb95c9aaa20835e21bc4e231582bfc7b27a"}
Sep 30 13:56:02 crc kubenswrapper[4763]: I0930 13:56:02.009831 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc","Type":"ContainerStarted","Data":"5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da"}
Sep 30 13:56:02 crc kubenswrapper[4763]: I0930 13:56:02.895069 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-75bcdb8fc9-ml4n8"
Sep 30 13:56:02 crc kubenswrapper[4763]: I0930 13:56:02.974040 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-576df9b9d8-5btc5"]
Sep 30 13:56:02 crc kubenswrapper[4763]: I0930 13:56:02.974294 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-576df9b9d8-5btc5" podUID="fc3e6347-c27f-4249-a1b7-145165c06d70" containerName="neutron-api" containerID="cri-o://4774b3f6ac7b157b234c17b092bac7e0ad012e21d52b9da28446843481c35238" gracePeriod=30
Sep 30 13:56:02 crc kubenswrapper[4763]: I0930 13:56:02.974435 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-576df9b9d8-5btc5" podUID="fc3e6347-c27f-4249-a1b7-145165c06d70" containerName="neutron-httpd" containerID="cri-o://b854e79c47c6f6e756da69e22feb04c256f5f94040230f706fcdea5ac2ac8dc0" gracePeriod=30
Sep 30 13:56:03 crc kubenswrapper[4763]: I0930 13:56:03.029311 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc","Type":"ContainerStarted","Data":"1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426"}
Sep 30 13:56:04 crc kubenswrapper[4763]: I0930 13:56:04.041467 4763 generic.go:334] "Generic (PLEG): container finished" podID="fc3e6347-c27f-4249-a1b7-145165c06d70" containerID="b854e79c47c6f6e756da69e22feb04c256f5f94040230f706fcdea5ac2ac8dc0" exitCode=0
Sep 30 13:56:04 crc kubenswrapper[4763]: I0930 13:56:04.041523 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576df9b9d8-5btc5" event={"ID":"fc3e6347-c27f-4249-a1b7-145165c06d70","Type":"ContainerDied","Data":"b854e79c47c6f6e756da69e22feb04c256f5f94040230f706fcdea5ac2ac8dc0"}
Sep 30 13:56:06 crc kubenswrapper[4763]: I0930 13:56:06.059492 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc","Type":"ContainerStarted","Data":"546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873"}
Sep 30 13:56:08 crc kubenswrapper[4763]: I0930 13:56:08.083242 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc","Type":"ContainerStarted","Data":"79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518"}
Sep 30 13:56:08 crc kubenswrapper[4763]: I0930 13:56:08.083674 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 13:56:08 crc kubenswrapper[4763]: I0930 13:56:08.107658 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.89681743 podStartE2EDuration="8.107640341s" podCreationTimestamp="2025-09-30 13:56:00 +0000 UTC" firstStartedPulling="2025-09-30 13:56:00.82913038 +0000 UTC m=+1232.967690665" lastFinishedPulling="2025-09-30 13:56:07.039953291 +0000 UTC m=+1239.178513576" observedRunningTime="2025-09-30 13:56:08.105547708 +0000 UTC m=+1240.244107993" watchObservedRunningTime="2025-09-30 13:56:08.107640341 +0000 UTC m=+1240.246200636"
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.094550 4763 generic.go:334] "Generic (PLEG): container finished" podID="fc3e6347-c27f-4249-a1b7-145165c06d70" containerID="4774b3f6ac7b157b234c17b092bac7e0ad012e21d52b9da28446843481c35238" exitCode=0
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.094642 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576df9b9d8-5btc5" event={"ID":"fc3e6347-c27f-4249-a1b7-145165c06d70","Type":"ContainerDied","Data":"4774b3f6ac7b157b234c17b092bac7e0ad012e21d52b9da28446843481c35238"}
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.094928 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576df9b9d8-5btc5" event={"ID":"fc3e6347-c27f-4249-a1b7-145165c06d70","Type":"ContainerDied","Data":"e5e7729ce3ef860ab0eca1555d7259d6a5075e8f2fe0fe815a41108915f6d066"}
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.094946 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5e7729ce3ef860ab0eca1555d7259d6a5075e8f2fe0fe815a41108915f6d066"
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.138878 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-576df9b9d8-5btc5"
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.200060 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-combined-ca-bundle\") pod \"fc3e6347-c27f-4249-a1b7-145165c06d70\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") "
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.200122 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-config\") pod \"fc3e6347-c27f-4249-a1b7-145165c06d70\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") "
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.200225 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26sn2\" (UniqueName: \"kubernetes.io/projected/fc3e6347-c27f-4249-a1b7-145165c06d70-kube-api-access-26sn2\") pod \"fc3e6347-c27f-4249-a1b7-145165c06d70\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") "
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.200259 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-ovndb-tls-certs\") pod \"fc3e6347-c27f-4249-a1b7-145165c06d70\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") "
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.200366 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-httpd-config\") pod \"fc3e6347-c27f-4249-a1b7-145165c06d70\" (UID: \"fc3e6347-c27f-4249-a1b7-145165c06d70\") "
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.206223 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fc3e6347-c27f-4249-a1b7-145165c06d70" (UID: "fc3e6347-c27f-4249-a1b7-145165c06d70"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.206226 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3e6347-c27f-4249-a1b7-145165c06d70-kube-api-access-26sn2" (OuterVolumeSpecName: "kube-api-access-26sn2") pod "fc3e6347-c27f-4249-a1b7-145165c06d70" (UID: "fc3e6347-c27f-4249-a1b7-145165c06d70"). InnerVolumeSpecName "kube-api-access-26sn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.246222 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc3e6347-c27f-4249-a1b7-145165c06d70" (UID: "fc3e6347-c27f-4249-a1b7-145165c06d70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.246481 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-config" (OuterVolumeSpecName: "config") pod "fc3e6347-c27f-4249-a1b7-145165c06d70" (UID: "fc3e6347-c27f-4249-a1b7-145165c06d70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.272313 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fc3e6347-c27f-4249-a1b7-145165c06d70" (UID: "fc3e6347-c27f-4249-a1b7-145165c06d70"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.303425 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-httpd-config\") on node \"crc\" DevicePath \"\""
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.303515 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.303532 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-config\") on node \"crc\" DevicePath \"\""
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.303579 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26sn2\" (UniqueName: \"kubernetes.io/projected/fc3e6347-c27f-4249-a1b7-145165c06d70-kube-api-access-26sn2\") on node \"crc\" DevicePath \"\""
Sep 30 13:56:09 crc kubenswrapper[4763]: I0930 13:56:09.303614 4763 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3e6347-c27f-4249-a1b7-145165c06d70-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 13:56:10 crc kubenswrapper[4763]: I0930 13:56:10.105874 4763 generic.go:334] "Generic (PLEG): container finished" podID="c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7" containerID="9f293e39c482f59dc3fed18cd673e0ab7b3a1b249fcc80c88a3973a815f36bca" exitCode=0
Sep 30 13:56:10 crc kubenswrapper[4763]: I0930 13:56:10.105964 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mgpxr" event={"ID":"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7","Type":"ContainerDied","Data":"9f293e39c482f59dc3fed18cd673e0ab7b3a1b249fcc80c88a3973a815f36bca"}
Sep 30 13:56:10 crc kubenswrapper[4763]: I0930 13:56:10.107003 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-576df9b9d8-5btc5"
Sep 30 13:56:10 crc kubenswrapper[4763]: I0930 13:56:10.141727 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-576df9b9d8-5btc5"]
Sep 30 13:56:10 crc kubenswrapper[4763]: I0930 13:56:10.150068 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-576df9b9d8-5btc5"]
Sep 30 13:56:10 crc kubenswrapper[4763]: I0930 13:56:10.501461 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc3e6347-c27f-4249-a1b7-145165c06d70" path="/var/lib/kubelet/pods/fc3e6347-c27f-4249-a1b7-145165c06d70/volumes"
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.439852 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mgpxr"
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.539263 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-scripts\") pod \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") "
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.539651 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-config-data\") pod \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") "
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.539908 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-combined-ca-bundle\") pod \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") "
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.539991 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pj4z\" (UniqueName: \"kubernetes.io/projected/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-kube-api-access-7pj4z\") pod \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\" (UID: \"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7\") "
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.544894 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-scripts" (OuterVolumeSpecName: "scripts") pod "c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7" (UID: "c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.545574 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-kube-api-access-7pj4z" (OuterVolumeSpecName: "kube-api-access-7pj4z") pod "c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7" (UID: "c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7"). InnerVolumeSpecName "kube-api-access-7pj4z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.568479 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-config-data" (OuterVolumeSpecName: "config-data") pod "c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7" (UID: "c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.570386 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7" (UID: "c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.642171 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.642205 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pj4z\" (UniqueName: \"kubernetes.io/projected/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-kube-api-access-7pj4z\") on node \"crc\" DevicePath \"\""
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.642215 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 13:56:11 crc kubenswrapper[4763]: I0930 13:56:11.642223 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.126519 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mgpxr" event={"ID":"c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7","Type":"ContainerDied","Data":"c27456c198fc8cb3fd14ceb87d736218d9efd3e280a89cbc9631b492e3a1ea4f"}
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.126562 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c27456c198fc8cb3fd14ceb87d736218d9efd3e280a89cbc9631b492e3a1ea4f"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.126655 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mgpxr"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.208038 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Sep 30 13:56:12 crc kubenswrapper[4763]: E0930 13:56:12.208485 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3e6347-c27f-4249-a1b7-145165c06d70" containerName="neutron-httpd"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.208500 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3e6347-c27f-4249-a1b7-145165c06d70" containerName="neutron-httpd"
Sep 30 13:56:12 crc kubenswrapper[4763]: E0930 13:56:12.208520 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7" containerName="nova-cell0-conductor-db-sync"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.208528 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7" containerName="nova-cell0-conductor-db-sync"
Sep 30 13:56:12 crc kubenswrapper[4763]: E0930 13:56:12.208541 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3e6347-c27f-4249-a1b7-145165c06d70" containerName="neutron-api"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.208549 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3e6347-c27f-4249-a1b7-145165c06d70" containerName="neutron-api"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.208763 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3e6347-c27f-4249-a1b7-145165c06d70" containerName="neutron-api"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.208786 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3e6347-c27f-4249-a1b7-145165c06d70" containerName="neutron-httpd"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.208808 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7" containerName="nova-cell0-conductor-db-sync"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.209547 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.211880 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.212311 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nfrc5"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.216453 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.355982 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.356034 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.356282 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cff6\" (UniqueName: \"kubernetes.io/projected/7373f404-a756-4321-bd57-e8d60585abff-kube-api-access-4cff6\") pod \"nova-cell0-conductor-0\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.458137 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.458190 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.458278 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cff6\" (UniqueName: \"kubernetes.io/projected/7373f404-a756-4321-bd57-e8d60585abff-kube-api-access-4cff6\") pod \"nova-cell0-conductor-0\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.462582 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.464226 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.473568 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cff6\" (UniqueName: \"kubernetes.io/projected/7373f404-a756-4321-bd57-e8d60585abff-kube-api-access-4cff6\") pod \"nova-cell0-conductor-0\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.525330 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:12 crc kubenswrapper[4763]: I0930 13:56:12.993828 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Sep 30 13:56:12 crc kubenswrapper[4763]: W0930 13:56:12.998426 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7373f404_a756_4321_bd57_e8d60585abff.slice/crio-782e224536d49e89d38a2969163c696e56ed68e8a16ec5bb32cd6a12094b6e22 WatchSource:0}: Error finding container 782e224536d49e89d38a2969163c696e56ed68e8a16ec5bb32cd6a12094b6e22: Status 404 returned error can't find the container with id 782e224536d49e89d38a2969163c696e56ed68e8a16ec5bb32cd6a12094b6e22
Sep 30 13:56:13 crc kubenswrapper[4763]: I0930 13:56:13.137004 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7373f404-a756-4321-bd57-e8d60585abff","Type":"ContainerStarted","Data":"782e224536d49e89d38a2969163c696e56ed68e8a16ec5bb32cd6a12094b6e22"}
Sep 30 13:56:14 crc kubenswrapper[4763]: I0930 13:56:14.147247 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7373f404-a756-4321-bd57-e8d60585abff","Type":"ContainerStarted","Data":"605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043"}
Sep 30 13:56:14 crc kubenswrapper[4763]: I0930 13:56:14.147656 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:14 crc kubenswrapper[4763]: I0930 13:56:14.165583 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.165560301 podStartE2EDuration="2.165560301s" podCreationTimestamp="2025-09-30 13:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:14.161830168 +0000 UTC m=+1246.300390453" watchObservedRunningTime="2025-09-30 13:56:14.165560301 +0000 UTC m=+1246.304120586"
Sep 30 13:56:22 crc kubenswrapper[4763]: I0930 13:56:22.553818 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.089015 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-sn928"]
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.090707 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.093129 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.102394 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.119591 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-sn928"]
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.157343 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-config-data\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.157420 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-scripts\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.157442 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwm79\" (UniqueName: \"kubernetes.io/projected/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-kube-api-access-qwm79\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.157502 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.259055 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-config-data\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.259149 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-scripts\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.259182 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwm79\" (UniqueName: \"kubernetes.io/projected/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-kube-api-access-qwm79\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.259230 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.275557 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-config-data\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.275939 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-scripts\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.277230 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.307286 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwm79\" (UniqueName: \"kubernetes.io/projected/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-kube-api-access-qwm79\") pod \"nova-cell0-cell-mapping-sn928\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.313245 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.336360 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.336466 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.359033 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.396621 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.398245 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.403056 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.414806 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sn928"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.440646 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.464869 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.464943 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55jml\" (UniqueName: \"kubernetes.io/projected/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-kube-api-access-55jml\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.465120 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdwh2\" (UniqueName: \"kubernetes.io/projected/ba4da45e-3335-4db0-b45a-b5270296ee35-kube-api-access-bdwh2\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.465218 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-config-data\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.465402 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-config-data\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.465428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.465527 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-logs\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.465556 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4da45e-3335-4db0-b45a-b5270296ee35-logs\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.470876 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.472564 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.484487 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.551898 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.561367 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.567700 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.571260 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55jml\" (UniqueName: \"kubernetes.io/projected/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-kube-api-access-55jml\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.571325 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.571429 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdwh2\" (UniqueName: \"kubernetes.io/projected/ba4da45e-3335-4db0-b45a-b5270296ee35-kube-api-access-bdwh2\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.571482 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-config-data\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.571525 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mpwf\" (UniqueName: \"kubernetes.io/projected/c8b04d40-51cf-4a31-b70c-bd760d800aca-kube-api-access-4mpwf\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.571653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-config-data\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.571696 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.571737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.571809 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-logs\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.571839 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4da45e-3335-4db0-b45a-b5270296ee35-logs\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.571868 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.578113 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-logs\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.579141 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-config-data\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.583312 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.585933 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4da45e-3335-4db0-b45a-b5270296ee35-logs\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.609099 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.611887 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55jml\" (UniqueName: \"kubernetes.io/projected/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-kube-api-access-55jml\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.619319 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-config-data\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0"
Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.626336 4763
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " pod="openstack/nova-metadata-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.642654 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.644234 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdwh2\" (UniqueName: \"kubernetes.io/projected/ba4da45e-3335-4db0-b45a-b5270296ee35-kube-api-access-bdwh2\") pod \"nova-api-0\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " pod="openstack/nova-api-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.676711 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.677876 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hcd7\" (UniqueName: \"kubernetes.io/projected/2b436487-c350-46d9-b1b8-75314d1d4605-kube-api-access-8hcd7\") pod \"nova-scheduler-0\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.677964 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.678056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.678083 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-config-data\") pod \"nova-scheduler-0\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.678163 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.678200 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mpwf\" (UniqueName: \"kubernetes.io/projected/c8b04d40-51cf-4a31-b70c-bd760d800aca-kube-api-access-4mpwf\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.684912 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-config-data\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.695848 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.712329 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-4hcqr"] Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.712627 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mpwf\" (UniqueName: \"kubernetes.io/projected/c8b04d40-51cf-4a31-b70c-bd760d800aca-kube-api-access-4mpwf\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.713925 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.740882 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-4hcqr"] Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.741326 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.789812 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-swift-storage-0\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.789871 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.789897 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pjf2\" (UniqueName: \"kubernetes.io/projected/db68026f-4475-4db7-a942-3ba486623fc5-kube-api-access-2pjf2\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.789938 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.790012 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " 
pod="openstack/nova-scheduler-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.790055 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-svc\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.790096 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-config\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.790121 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hcd7\" (UniqueName: \"kubernetes.io/projected/2b436487-c350-46d9-b1b8-75314d1d4605-kube-api-access-8hcd7\") pod \"nova-scheduler-0\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.790220 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-config-data\") pod \"nova-scheduler-0\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.812106 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hcd7\" (UniqueName: \"kubernetes.io/projected/2b436487-c350-46d9-b1b8-75314d1d4605-kube-api-access-8hcd7\") pod \"nova-scheduler-0\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.813849 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-config-data\") pod \"nova-scheduler-0\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.813687 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.895167 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.895231 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pjf2\" (UniqueName: \"kubernetes.io/projected/db68026f-4475-4db7-a942-3ba486623fc5-kube-api-access-2pjf2\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.895266 4763 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.895340 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-svc\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.895378 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-config\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.895569 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-swift-storage-0\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.898715 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-swift-storage-0\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.899186 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.899628 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.913874 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-svc\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.914364 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-config\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.924902 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pjf2\" (UniqueName: 
\"kubernetes.io/projected/db68026f-4475-4db7-a942-3ba486623fc5-kube-api-access-2pjf2\") pod \"dnsmasq-dns-7d9cc4c77f-4hcqr\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.953998 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:56:23 crc kubenswrapper[4763]: I0930 13:56:23.954230 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.135735 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.238485 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-sn928"] Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.247725 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lkpxs"] Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.252759 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.257378 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.257540 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.266276 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lkpxs"] Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.383986 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.395563 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:56:24 crc kubenswrapper[4763]: W0930 13:56:24.396206 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c2bfb38_17a1_4e35_a5e1_20144611d2e2.slice/crio-b89af2dd19180896e2bc1030444b672f82d5d1f869a3f2e368156fc6d8d3d488 WatchSource:0}: Error finding container b89af2dd19180896e2bc1030444b672f82d5d1f869a3f2e368156fc6d8d3d488: Status 404 returned error can't find the container with id b89af2dd19180896e2bc1030444b672f82d5d1f869a3f2e368156fc6d8d3d488 Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.405488 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68kwj\" (UniqueName: \"kubernetes.io/projected/df19ca80-6959-485e-b83a-b3c643874684-kube-api-access-68kwj\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.405546 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 
13:56:24.405638 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-scripts\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.405743 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-config-data\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.507816 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68kwj\" (UniqueName: \"kubernetes.io/projected/df19ca80-6959-485e-b83a-b3c643874684-kube-api-access-68kwj\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.507884 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.507952 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-scripts\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.508004 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-config-data\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.514415 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.516807 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-config-data\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.519050 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-scripts\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc 
kubenswrapper[4763]: I0930 13:56:24.541449 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68kwj\" (UniqueName: \"kubernetes.io/projected/df19ca80-6959-485e-b83a-b3c643874684-kube-api-access-68kwj\") pod \"nova-cell1-conductor-db-sync-lkpxs\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.564779 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.610838 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:56:24 crc kubenswrapper[4763]: W0930 13:56:24.611700 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8b04d40_51cf_4a31_b70c_bd760d800aca.slice/crio-76594ab690d338ab63d5bc1c07db9e063729a0647426be38bb7052e6c384d75a WatchSource:0}: Error finding container 76594ab690d338ab63d5bc1c07db9e063729a0647426be38bb7052e6c384d75a: Status 404 returned error can't find the container with id 76594ab690d338ab63d5bc1c07db9e063729a0647426be38bb7052e6c384d75a Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.617499 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:56:24 crc kubenswrapper[4763]: W0930 13:56:24.656534 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b436487_c350_46d9_b1b8_75314d1d4605.slice/crio-5a7918b783b4461ef46f57980d5541f2f0aceaa4e841dfef219959aa4ff39d1c WatchSource:0}: Error finding container 5a7918b783b4461ef46f57980d5541f2f0aceaa4e841dfef219959aa4ff39d1c: Status 404 returned error can't find the container with id 5a7918b783b4461ef46f57980d5541f2f0aceaa4e841dfef219959aa4ff39d1c Sep 30 13:56:24 crc kubenswrapper[4763]: I0930 13:56:24.886585 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-4hcqr"] Sep 30 13:56:25 crc kubenswrapper[4763]: I0930 13:56:25.252881 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lkpxs"] Sep 30 13:56:25 crc kubenswrapper[4763]: I0930 13:56:25.313687 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4da45e-3335-4db0-b45a-b5270296ee35","Type":"ContainerStarted","Data":"e470297eff9b3b4e1f1c0e6de0c4873cc33200389591f49e114d7f7c6d3e4a67"} Sep 30 13:56:25 crc kubenswrapper[4763]: I0930 13:56:25.339634 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c2bfb38-17a1-4e35-a5e1-20144611d2e2","Type":"ContainerStarted","Data":"b89af2dd19180896e2bc1030444b672f82d5d1f869a3f2e368156fc6d8d3d488"} Sep 30 13:56:25 crc kubenswrapper[4763]: I0930 13:56:25.351674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c8b04d40-51cf-4a31-b70c-bd760d800aca","Type":"ContainerStarted","Data":"76594ab690d338ab63d5bc1c07db9e063729a0647426be38bb7052e6c384d75a"} Sep 30 13:56:25 crc kubenswrapper[4763]: I0930 13:56:25.358916 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b436487-c350-46d9-b1b8-75314d1d4605","Type":"ContainerStarted","Data":"5a7918b783b4461ef46f57980d5541f2f0aceaa4e841dfef219959aa4ff39d1c"} Sep 30 13:56:25 crc 
kubenswrapper[4763]: I0930 13:56:25.378428 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sn928" event={"ID":"19caf19d-2082-4c4a-b091-54d6b3d3f1ea","Type":"ContainerStarted","Data":"a1f8a7e885307c1270b15dcfc52ae38978cfd08ad2beb3c528ebc1aba347578d"} Sep 30 13:56:25 crc kubenswrapper[4763]: I0930 13:56:25.378477 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sn928" event={"ID":"19caf19d-2082-4c4a-b091-54d6b3d3f1ea","Type":"ContainerStarted","Data":"a6534599c35c19bc4536a00ebe830c0c1c38020c6f535b6d19dd0ebdf5211117"} Sep 30 13:56:25 crc kubenswrapper[4763]: I0930 13:56:25.391265 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" event={"ID":"db68026f-4475-4db7-a942-3ba486623fc5","Type":"ContainerStarted","Data":"43382424ab8195f91b66c3713af9159c5287bdaac738a2651285cdc26db0a5d8"} Sep 30 13:56:25 crc kubenswrapper[4763]: I0930 13:56:25.412700 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-sn928" podStartSLOduration=2.412676369 podStartE2EDuration="2.412676369s" podCreationTimestamp="2025-09-30 13:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:25.398929625 +0000 UTC m=+1257.537489920" watchObservedRunningTime="2025-09-30 13:56:25.412676369 +0000 UTC m=+1257.551236654" Sep 30 13:56:26 crc kubenswrapper[4763]: I0930 13:56:26.412610 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lkpxs" event={"ID":"df19ca80-6959-485e-b83a-b3c643874684","Type":"ContainerStarted","Data":"a7d873e31826e92d279b98b286bec38c823951291ed5ac5a558bb509d85721d5"} Sep 30 13:56:26 crc kubenswrapper[4763]: I0930 13:56:26.412911 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lkpxs" event={"ID":"df19ca80-6959-485e-b83a-b3c643874684","Type":"ContainerStarted","Data":"193c83f310108ad564cef1b9842052442f0d394a5a008b309617f082d5766da0"} Sep 30 13:56:26 crc kubenswrapper[4763]: I0930 13:56:26.418906 4763 generic.go:334] "Generic (PLEG): container finished" podID="db68026f-4475-4db7-a942-3ba486623fc5" containerID="e2fe3c664a0371aca61ca711621315d51b62700f060d37a7874df9f529fec7e9" exitCode=0 Sep 30 13:56:26 crc kubenswrapper[4763]: I0930 13:56:26.419195 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" event={"ID":"db68026f-4475-4db7-a942-3ba486623fc5","Type":"ContainerDied","Data":"e2fe3c664a0371aca61ca711621315d51b62700f060d37a7874df9f529fec7e9"} Sep 30 13:56:26 crc kubenswrapper[4763]: I0930 13:56:26.419270 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:26 crc kubenswrapper[4763]: I0930 13:56:26.419284 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" event={"ID":"db68026f-4475-4db7-a942-3ba486623fc5","Type":"ContainerStarted","Data":"d6a9a0e305070b8965c1c9f3814820c8d0564d3560a0e9623089e72d04b7cb8d"} Sep 30 13:56:26 crc kubenswrapper[4763]: I0930 13:56:26.431383 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-lkpxs" podStartSLOduration=2.431365802 podStartE2EDuration="2.431365802s" podCreationTimestamp="2025-09-30 13:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:26.430989712 +0000 UTC m=+1258.569549997" watchObservedRunningTime="2025-09-30 13:56:26.431365802 +0000 UTC m=+1258.569926077" Sep 30 13:56:26 crc kubenswrapper[4763]: I0930 13:56:26.453833 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" podStartSLOduration=3.453809734 podStartE2EDuration="3.453809734s" podCreationTimestamp="2025-09-30 13:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:26.452806089 +0000 UTC m=+1258.591366404" watchObservedRunningTime="2025-09-30 13:56:26.453809734 +0000 UTC m=+1258.592370029" Sep 30 13:56:27 crc kubenswrapper[4763]: I0930 13:56:27.472831 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:27 crc kubenswrapper[4763]: I0930 13:56:27.654640 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.450984 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4da45e-3335-4db0-b45a-b5270296ee35","Type":"ContainerStarted","Data":"38dd2ae4bc923388e12df617d9e41a05f8f14faba54341609c11baf3dfffbd25"} Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.451657 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4da45e-3335-4db0-b45a-b5270296ee35","Type":"ContainerStarted","Data":"2972898c0e994d6be6707392b64e73292af89d7f46e2b233c34ccb236b53772b"} Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.453690 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5c2bfb38-17a1-4e35-a5e1-20144611d2e2" containerName="nova-metadata-log" containerID="cri-o://58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97" gracePeriod=30 Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.453742 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5c2bfb38-17a1-4e35-a5e1-20144611d2e2" containerName="nova-metadata-metadata" containerID="cri-o://9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5" gracePeriod=30 Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.453789 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c2bfb38-17a1-4e35-a5e1-20144611d2e2","Type":"ContainerStarted","Data":"9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5"} Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.453825 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c2bfb38-17a1-4e35-a5e1-20144611d2e2","Type":"ContainerStarted","Data":"58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97"} Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.456102 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c8b04d40-51cf-4a31-b70c-bd760d800aca","Type":"ContainerStarted","Data":"272acd02c53a37dfb9729b5a518d4b01ce95a61821dc87ad8baac1d2c9db8bfa"} Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.456256 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="c8b04d40-51cf-4a31-b70c-bd760d800aca" 
containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://272acd02c53a37dfb9729b5a518d4b01ce95a61821dc87ad8baac1d2c9db8bfa" gracePeriod=30 Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.459133 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b436487-c350-46d9-b1b8-75314d1d4605","Type":"ContainerStarted","Data":"c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485"} Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.479796 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.480206921 podStartE2EDuration="6.479774055s" podCreationTimestamp="2025-09-30 13:56:23 +0000 UTC" firstStartedPulling="2025-09-30 13:56:24.403143427 +0000 UTC m=+1256.541703712" lastFinishedPulling="2025-09-30 13:56:28.402710561 +0000 UTC m=+1260.541270846" observedRunningTime="2025-09-30 13:56:29.470173354 +0000 UTC m=+1261.608733659" watchObservedRunningTime="2025-09-30 13:56:29.479774055 +0000 UTC m=+1261.618334340" Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.501899 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5039089150000002 podStartE2EDuration="6.501874799s" podCreationTimestamp="2025-09-30 13:56:23 +0000 UTC" firstStartedPulling="2025-09-30 13:56:24.404826559 +0000 UTC m=+1256.543386844" lastFinishedPulling="2025-09-30 13:56:28.402792443 +0000 UTC m=+1260.541352728" observedRunningTime="2025-09-30 13:56:29.495388016 +0000 UTC m=+1261.633948301" watchObservedRunningTime="2025-09-30 13:56:29.501874799 +0000 UTC m=+1261.640435084" Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.518269 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.728530351 podStartE2EDuration="6.518252968s" podCreationTimestamp="2025-09-30 13:56:23 +0000 UTC" firstStartedPulling="2025-09-30 13:56:24.635812106 +0000 UTC m=+1256.774372391" lastFinishedPulling="2025-09-30 13:56:28.425534723 +0000 UTC m=+1260.564095008" observedRunningTime="2025-09-30 13:56:29.512107445 +0000 UTC m=+1261.650667730" watchObservedRunningTime="2025-09-30 13:56:29.518252968 +0000 UTC m=+1261.656813243" Sep 30 13:56:29 crc kubenswrapper[4763]: I0930 13:56:29.538224 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7942276379999997 podStartE2EDuration="6.538207209s" podCreationTimestamp="2025-09-30 13:56:23 +0000 UTC" firstStartedPulling="2025-09-30 13:56:24.659905979 +0000 UTC m=+1256.798466264" lastFinishedPulling="2025-09-30 13:56:28.40388555 +0000 UTC m=+1260.542445835" observedRunningTime="2025-09-30 13:56:29.527454479 +0000 UTC m=+1261.666014764" watchObservedRunningTime="2025-09-30 13:56:29.538207209 +0000 UTC m=+1261.676767484" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.067868 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.137562 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55jml\" (UniqueName: \"kubernetes.io/projected/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-kube-api-access-55jml\") pod \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.137649 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-combined-ca-bundle\") pod \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.137736 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-logs\") pod \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.137768 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-config-data\") pod \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\" (UID: \"5c2bfb38-17a1-4e35-a5e1-20144611d2e2\") " Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.138142 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-logs" (OuterVolumeSpecName: "logs") pod "5c2bfb38-17a1-4e35-a5e1-20144611d2e2" (UID: "5c2bfb38-17a1-4e35-a5e1-20144611d2e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.138515 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.143153 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-kube-api-access-55jml" (OuterVolumeSpecName: "kube-api-access-55jml") pod "5c2bfb38-17a1-4e35-a5e1-20144611d2e2" (UID: "5c2bfb38-17a1-4e35-a5e1-20144611d2e2"). InnerVolumeSpecName "kube-api-access-55jml". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.171459 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c2bfb38-17a1-4e35-a5e1-20144611d2e2" (UID: "5c2bfb38-17a1-4e35-a5e1-20144611d2e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.175465 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-config-data" (OuterVolumeSpecName: "config-data") pod "5c2bfb38-17a1-4e35-a5e1-20144611d2e2" (UID: "5c2bfb38-17a1-4e35-a5e1-20144611d2e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.239327 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.239363 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55jml\" (UniqueName: \"kubernetes.io/projected/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-kube-api-access-55jml\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.239372 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2bfb38-17a1-4e35-a5e1-20144611d2e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.408680 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.480271 4763 generic.go:334] "Generic (PLEG): container finished" podID="5c2bfb38-17a1-4e35-a5e1-20144611d2e2" containerID="9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5" exitCode=0 Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.480643 4763 generic.go:334] "Generic (PLEG): container finished" podID="5c2bfb38-17a1-4e35-a5e1-20144611d2e2" containerID="58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97" exitCode=143 Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.480352 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.480311 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c2bfb38-17a1-4e35-a5e1-20144611d2e2","Type":"ContainerDied","Data":"9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5"} Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.480751 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c2bfb38-17a1-4e35-a5e1-20144611d2e2","Type":"ContainerDied","Data":"58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97"} Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.480763 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c2bfb38-17a1-4e35-a5e1-20144611d2e2","Type":"ContainerDied","Data":"b89af2dd19180896e2bc1030444b672f82d5d1f869a3f2e368156fc6d8d3d488"} Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.480794 4763 scope.go:117] "RemoveContainer" containerID="9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.514805 4763 scope.go:117] "RemoveContainer" containerID="58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.556846 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.585063 4763 scope.go:117] "RemoveContainer" containerID="9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5" Sep 30 13:56:30 crc kubenswrapper[4763]: E0930 13:56:30.587109 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5\": container with ID starting with 9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5 not found: ID does not exist" containerID="9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.587164 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5"} err="failed to get container status \"9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5\": rpc error: code = NotFound desc = could not find container \"9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5\": container with ID starting with 9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5 not found: ID does not exist" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.587196 4763 scope.go:117] "RemoveContainer" containerID="58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97" Sep 30 13:56:30 crc kubenswrapper[4763]: E0930 13:56:30.587468 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97\": container with ID starting with 58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97 not found: ID does not exist" containerID="58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.587499 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97"} err="failed to get container status \"58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97\": rpc error: code = NotFound desc = could not find container \"58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97\": container with ID starting with 58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97 not found: ID does not exist" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.587518 4763 scope.go:117] "RemoveContainer" containerID="9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.588562 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5"} err="failed to get container status \"9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5\": rpc error: code = NotFound desc = could not find container \"9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5\": container with ID starting with 9b40627f6be76172ceca99a4ebb4a82a17500082a20c33636893523984c40cd5 not found: ID does not exist" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.588616 4763 scope.go:117] "RemoveContainer" containerID="58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.589014 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97"} err="failed to get container status \"58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97\": rpc error: code = NotFound desc = could not find container \"58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97\": container with ID starting with 
58cb41c28dde258b1c903c89a30da6e04f38b43acbe3330b59a9761c6745cf97 not found: ID does not exist" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.593049 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.609075 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:30 crc kubenswrapper[4763]: E0930 13:56:30.609697 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2bfb38-17a1-4e35-a5e1-20144611d2e2" containerName="nova-metadata-log" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.609719 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2bfb38-17a1-4e35-a5e1-20144611d2e2" containerName="nova-metadata-log" Sep 30 13:56:30 crc kubenswrapper[4763]: E0930 13:56:30.609763 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2bfb38-17a1-4e35-a5e1-20144611d2e2" containerName="nova-metadata-metadata" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.609773 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2bfb38-17a1-4e35-a5e1-20144611d2e2" containerName="nova-metadata-metadata" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.610014 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2bfb38-17a1-4e35-a5e1-20144611d2e2" containerName="nova-metadata-metadata" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.610055 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2bfb38-17a1-4e35-a5e1-20144611d2e2" containerName="nova-metadata-log" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.614443 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.618786 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.619000 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.622071 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.657943 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aaebec72-47cd-4ce5-b06b-d4814fba9214-logs\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.658040 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqz44\" (UniqueName: \"kubernetes.io/projected/aaebec72-47cd-4ce5-b06b-d4814fba9214-kube-api-access-lqz44\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.658098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.658158 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-config-data\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.658225 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.759609 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.759656 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aaebec72-47cd-4ce5-b06b-d4814fba9214-logs\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.759720 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqz44\" (UniqueName: \"kubernetes.io/projected/aaebec72-47cd-4ce5-b06b-d4814fba9214-kube-api-access-lqz44\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.759762 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.759823 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-config-data\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.760701 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aaebec72-47cd-4ce5-b06b-d4814fba9214-logs\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.763258 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-config-data\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.763413 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 
crc kubenswrapper[4763]: I0930 13:56:30.763682 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.783101 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqz44\" (UniqueName: \"kubernetes.io/projected/aaebec72-47cd-4ce5-b06b-d4814fba9214-kube-api-access-lqz44\") pod \"nova-metadata-0\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " pod="openstack/nova-metadata-0" Sep 30 13:56:30 crc kubenswrapper[4763]: I0930 13:56:30.941073 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:56:31 crc kubenswrapper[4763]: I0930 13:56:31.479416 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:31 crc kubenswrapper[4763]: I0930 13:56:31.499906 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aaebec72-47cd-4ce5-b06b-d4814fba9214","Type":"ContainerStarted","Data":"d5a37fb3416c4d516dc0b9f99bb5736c17bd3578e4116ddae1bb20d7d8e69e5e"} Sep 30 13:56:32 crc kubenswrapper[4763]: I0930 13:56:32.500754 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2bfb38-17a1-4e35-a5e1-20144611d2e2" path="/var/lib/kubelet/pods/5c2bfb38-17a1-4e35-a5e1-20144611d2e2/volumes" Sep 30 13:56:32 crc kubenswrapper[4763]: I0930 13:56:32.508628 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aaebec72-47cd-4ce5-b06b-d4814fba9214","Type":"ContainerStarted","Data":"686213d4b4263421d8cbb91e30ca2b44816308bdf4ea83099b02ab13fbaf1258"} Sep 30 13:56:32 crc kubenswrapper[4763]: I0930 13:56:32.508887 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aaebec72-47cd-4ce5-b06b-d4814fba9214","Type":"ContainerStarted","Data":"88ae4d3dde2710a1efacdf602a8c10d81c64e05ba43cf75a14a42b318a53d4fe"} Sep 30 13:56:32 crc kubenswrapper[4763]: I0930 13:56:32.530668 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.53064415 podStartE2EDuration="2.53064415s" podCreationTimestamp="2025-09-30 13:56:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:32.525115501 +0000 UTC m=+1264.663675786" watchObservedRunningTime="2025-09-30 13:56:32.53064415 +0000 UTC m=+1264.669204435" Sep 30 13:56:33 crc kubenswrapper[4763]: I0930 13:56:33.678229 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:56:33 crc kubenswrapper[4763]: I0930 13:56:33.679392 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:56:33 crc kubenswrapper[4763]: I0930 13:56:33.955226 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 13:56:33 crc kubenswrapper[4763]: I0930 13:56:33.955290 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 13:56:33 crc kubenswrapper[4763]: I0930 13:56:33.955307 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:56:33 crc kubenswrapper[4763]: I0930 13:56:33.985748 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.140032 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.224133 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-gc7p2"] Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.224767 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" podUID="7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" containerName="dnsmasq-dns" containerID="cri-o://011c47ce23ad6f62f6305136adfd0a8f49ce729205d0d49b56f81c72fd11eaf4" gracePeriod=10 Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.530563 4763 generic.go:334] "Generic (PLEG): container finished" podID="7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" containerID="011c47ce23ad6f62f6305136adfd0a8f49ce729205d0d49b56f81c72fd11eaf4" exitCode=0 Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.530629 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" event={"ID":"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0","Type":"ContainerDied","Data":"011c47ce23ad6f62f6305136adfd0a8f49ce729205d0d49b56f81c72fd11eaf4"} Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.533348 4763 generic.go:334] "Generic (PLEG): container finished" podID="19caf19d-2082-4c4a-b091-54d6b3d3f1ea" containerID="a1f8a7e885307c1270b15dcfc52ae38978cfd08ad2beb3c528ebc1aba347578d" exitCode=0 Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.533418 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sn928" event={"ID":"19caf19d-2082-4c4a-b091-54d6b3d3f1ea","Type":"ContainerDied","Data":"a1f8a7e885307c1270b15dcfc52ae38978cfd08ad2beb3c528ebc1aba347578d"} Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.535123 4763 generic.go:334] "Generic (PLEG): container finished" podID="df19ca80-6959-485e-b83a-b3c643874684" containerID="a7d873e31826e92d279b98b286bec38c823951291ed5ac5a558bb509d85721d5" exitCode=0 Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.535212 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lkpxs" event={"ID":"df19ca80-6959-485e-b83a-b3c643874684","Type":"ContainerDied","Data":"a7d873e31826e92d279b98b286bec38c823951291ed5ac5a558bb509d85721d5"} Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.586532 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.760807 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.760827 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:56:34 crc 
kubenswrapper[4763]: I0930 13:56:34.791059 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.837290 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-sb\") pod \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.837356 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-swift-storage-0\") pod \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.837415 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-svc\") pod \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.837477 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-config\") pod \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.837531 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-nb\") pod \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.837589 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bjx5\" (UniqueName: \"kubernetes.io/projected/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-kube-api-access-5bjx5\") pod \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\" (UID: \"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0\") " Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.858846 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-kube-api-access-5bjx5" (OuterVolumeSpecName: "kube-api-access-5bjx5") pod "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" (UID: "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0"). InnerVolumeSpecName "kube-api-access-5bjx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.922472 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" (UID: "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.932330 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" (UID: "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.935026 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-config" (OuterVolumeSpecName: "config") pod "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" (UID: "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.941612 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.941654 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.941669 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.941683 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bjx5\" (UniqueName: \"kubernetes.io/projected/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-kube-api-access-5bjx5\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.958128 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" (UID: "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:34 crc kubenswrapper[4763]: I0930 13:56:34.989270 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" (UID: "7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.054354 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.054401 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.094654 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.094895 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4412eaea-f645-451a-8b88-c562357c6b1e" containerName="kube-state-metrics" containerID="cri-o://0d0112a1787094253153b3a60f8663407e6cb545baa23acb0b8b21ec9335b321" gracePeriod=30 Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.563009 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" event={"ID":"7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0","Type":"ContainerDied","Data":"8f95e7762a85d9259af0e728e2135623de9492235fcb0b05213d5812754912d9"} Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.563066 4763 scope.go:117] "RemoveContainer" containerID="011c47ce23ad6f62f6305136adfd0a8f49ce729205d0d49b56f81c72fd11eaf4" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.563167 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c47bb5d77-gc7p2" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.599277 4763 generic.go:334] "Generic (PLEG): container finished" podID="4412eaea-f645-451a-8b88-c562357c6b1e" containerID="0d0112a1787094253153b3a60f8663407e6cb545baa23acb0b8b21ec9335b321" exitCode=2 Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.599472 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4412eaea-f645-451a-8b88-c562357c6b1e","Type":"ContainerDied","Data":"0d0112a1787094253153b3a60f8663407e6cb545baa23acb0b8b21ec9335b321"} Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.599506 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4412eaea-f645-451a-8b88-c562357c6b1e","Type":"ContainerDied","Data":"a76224d57b415530c39bb22b5ef2b9e213b179a97d3acc51904ce4f30b9cd44b"} Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.599521 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a76224d57b415530c39bb22b5ef2b9e213b179a97d3acc51904ce4f30b9cd44b" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.671040 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.675501 4763 scope.go:117] "RemoveContainer" containerID="3bb6c1094febeddb8fff27dc43669fa2c2470b5e86ec7be0f01dc90f42dfb92b" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.686819 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-gc7p2"] Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.698399 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-gc7p2"] Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.866998 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k9ck\" (UniqueName: \"kubernetes.io/projected/4412eaea-f645-451a-8b88-c562357c6b1e-kube-api-access-6k9ck\") pod \"4412eaea-f645-451a-8b88-c562357c6b1e\" (UID: \"4412eaea-f645-451a-8b88-c562357c6b1e\") " Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.871792 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4412eaea-f645-451a-8b88-c562357c6b1e-kube-api-access-6k9ck" (OuterVolumeSpecName: "kube-api-access-6k9ck") pod "4412eaea-f645-451a-8b88-c562357c6b1e" (UID: "4412eaea-f645-451a-8b88-c562357c6b1e"). InnerVolumeSpecName "kube-api-access-6k9ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.941860 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.941912 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.963887 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:35 crc kubenswrapper[4763]: I0930 13:56:35.969731 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k9ck\" (UniqueName: \"kubernetes.io/projected/4412eaea-f645-451a-8b88-c562357c6b1e-kube-api-access-6k9ck\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.036721 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sn928" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.060623 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.060681 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.072507 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-scripts\") pod \"df19ca80-6959-485e-b83a-b3c643874684\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.072756 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-combined-ca-bundle\") pod \"df19ca80-6959-485e-b83a-b3c643874684\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.072917 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-config-data\") pod \"df19ca80-6959-485e-b83a-b3c643874684\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.073040 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68kwj\" (UniqueName: \"kubernetes.io/projected/df19ca80-6959-485e-b83a-b3c643874684-kube-api-access-68kwj\") pod \"df19ca80-6959-485e-b83a-b3c643874684\" (UID: \"df19ca80-6959-485e-b83a-b3c643874684\") " Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.077574 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df19ca80-6959-485e-b83a-b3c643874684-kube-api-access-68kwj" (OuterVolumeSpecName: "kube-api-access-68kwj") pod "df19ca80-6959-485e-b83a-b3c643874684" (UID: "df19ca80-6959-485e-b83a-b3c643874684"). InnerVolumeSpecName "kube-api-access-68kwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.084011 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-scripts" (OuterVolumeSpecName: "scripts") pod "df19ca80-6959-485e-b83a-b3c643874684" (UID: "df19ca80-6959-485e-b83a-b3c643874684"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.114998 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-config-data" (OuterVolumeSpecName: "config-data") pod "df19ca80-6959-485e-b83a-b3c643874684" (UID: "df19ca80-6959-485e-b83a-b3c643874684"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.118748 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df19ca80-6959-485e-b83a-b3c643874684" (UID: "df19ca80-6959-485e-b83a-b3c643874684"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.174936 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-config-data\") pod \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.175119 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwm79\" (UniqueName: \"kubernetes.io/projected/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-kube-api-access-qwm79\") pod \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.175201 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-scripts\") pod \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.175251 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-combined-ca-bundle\") pod \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\" (UID: \"19caf19d-2082-4c4a-b091-54d6b3d3f1ea\") " Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.175808 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.175831 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68kwj\" (UniqueName: \"kubernetes.io/projected/df19ca80-6959-485e-b83a-b3c643874684-kube-api-access-68kwj\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.175841 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.175852 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df19ca80-6959-485e-b83a-b3c643874684-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.178442 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-kube-api-access-qwm79" (OuterVolumeSpecName: "kube-api-access-qwm79") pod "19caf19d-2082-4c4a-b091-54d6b3d3f1ea" (UID: "19caf19d-2082-4c4a-b091-54d6b3d3f1ea"). InnerVolumeSpecName "kube-api-access-qwm79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.182709 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-scripts" (OuterVolumeSpecName: "scripts") pod "19caf19d-2082-4c4a-b091-54d6b3d3f1ea" (UID: "19caf19d-2082-4c4a-b091-54d6b3d3f1ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.202414 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-config-data" (OuterVolumeSpecName: "config-data") pod "19caf19d-2082-4c4a-b091-54d6b3d3f1ea" (UID: "19caf19d-2082-4c4a-b091-54d6b3d3f1ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.208043 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19caf19d-2082-4c4a-b091-54d6b3d3f1ea" (UID: "19caf19d-2082-4c4a-b091-54d6b3d3f1ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.278107 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.278149 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwm79\" (UniqueName: \"kubernetes.io/projected/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-kube-api-access-qwm79\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.278165 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.278176 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19caf19d-2082-4c4a-b091-54d6b3d3f1ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.501206 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" path="/var/lib/kubelet/pods/7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0/volumes" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.612502 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sn928" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.612558 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sn928" event={"ID":"19caf19d-2082-4c4a-b091-54d6b3d3f1ea","Type":"ContainerDied","Data":"a6534599c35c19bc4536a00ebe830c0c1c38020c6f535b6d19dd0ebdf5211117"} Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.612983 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6534599c35c19bc4536a00ebe830c0c1c38020c6f535b6d19dd0ebdf5211117" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.614743 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.614741 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lkpxs" event={"ID":"df19ca80-6959-485e-b83a-b3c643874684","Type":"ContainerDied","Data":"193c83f310108ad564cef1b9842052442f0d394a5a008b309617f082d5766da0"} Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.614801 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lkpxs" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.614824 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="193c83f310108ad564cef1b9842052442f0d394a5a008b309617f082d5766da0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.710844 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.749617 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.760077 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:56:36 crc kubenswrapper[4763]: E0930 13:56:36.760941 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4412eaea-f645-451a-8b88-c562357c6b1e" containerName="kube-state-metrics" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.760993 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4412eaea-f645-451a-8b88-c562357c6b1e" containerName="kube-state-metrics" Sep 30 13:56:36 crc kubenswrapper[4763]: E0930 13:56:36.761020 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" containerName="init" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.761027 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" containerName="init" Sep 30 13:56:36 crc kubenswrapper[4763]: E0930 13:56:36.761070 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19caf19d-2082-4c4a-b091-54d6b3d3f1ea" containerName="nova-manage" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.761078 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="19caf19d-2082-4c4a-b091-54d6b3d3f1ea" containerName="nova-manage" Sep 30 13:56:36 crc kubenswrapper[4763]: E0930 13:56:36.761089 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" containerName="dnsmasq-dns" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.761096 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" containerName="dnsmasq-dns" Sep 30 13:56:36 crc kubenswrapper[4763]: E0930 13:56:36.761160 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df19ca80-6959-485e-b83a-b3c643874684" containerName="nova-cell1-conductor-db-sync" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.761169 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="df19ca80-6959-485e-b83a-b3c643874684" containerName="nova-cell1-conductor-db-sync" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.761504 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="df19ca80-6959-485e-b83a-b3c643874684" containerName="nova-cell1-conductor-db-sync" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.761522 4763 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="19caf19d-2082-4c4a-b091-54d6b3d3f1ea" containerName="nova-manage" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.761538 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f0c4ebb-7bf6-43f3-8880-8e4e9f9874b0" containerName="dnsmasq-dns" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.761588 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4412eaea-f645-451a-8b88-c562357c6b1e" containerName="kube-state-metrics" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.762704 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.767030 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.767245 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.786920 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.797656 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.799069 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.808636 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.809207 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.890244 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.891441 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.891755 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.891941 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9th2\" (UniqueName: \"kubernetes.io/projected/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-api-access-m9th2\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 
13:56:36.916470 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.924012 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aaebec72-47cd-4ce5-b06b-d4814fba9214" containerName="nova-metadata-log" containerID="cri-o://88ae4d3dde2710a1efacdf602a8c10d81c64e05ba43cf75a14a42b318a53d4fe" gracePeriod=30 Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.924075 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aaebec72-47cd-4ce5-b06b-d4814fba9214" containerName="nova-metadata-metadata" containerID="cri-o://686213d4b4263421d8cbb91e30ca2b44816308bdf4ea83099b02ab13fbaf1258" gracePeriod=30 Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.944223 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.944490 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerName="nova-api-log" containerID="cri-o://2972898c0e994d6be6707392b64e73292af89d7f46e2b233c34ccb236b53772b" gracePeriod=30 Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.944575 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerName="nova-api-api" containerID="cri-o://38dd2ae4bc923388e12df617d9e41a05f8f14faba54341609c11baf3dfffbd25" gracePeriod=30 Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.960474 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.960720 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2b436487-c350-46d9-b1b8-75314d1d4605" containerName="nova-scheduler-scheduler" containerID="cri-o://c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485" gracePeriod=30 Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.993265 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.993308 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.993353 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7959k\" (UniqueName: \"kubernetes.io/projected/aefbc43e-494e-48a6-963c-7be9d0159387-kube-api-access-7959k\") pod \"nova-cell1-conductor-0\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.993423 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.993446 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.993490 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.993549 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9th2\" (UniqueName: \"kubernetes.io/projected/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-api-access-m9th2\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.997709 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:36 crc kubenswrapper[4763]: I0930 13:56:36.997792 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.002052 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.025019 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9th2\" (UniqueName: \"kubernetes.io/projected/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-api-access-m9th2\") pod \"kube-state-metrics-0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " pod="openstack/kube-state-metrics-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.087330 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.094709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7959k\" (UniqueName: \"kubernetes.io/projected/aefbc43e-494e-48a6-963c-7be9d0159387-kube-api-access-7959k\") pod \"nova-cell1-conductor-0\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.094811 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.094835 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.098904 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.099235 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.116704 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7959k\" (UniqueName: \"kubernetes.io/projected/aefbc43e-494e-48a6-963c-7be9d0159387-kube-api-access-7959k\") pod \"nova-cell1-conductor-0\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.138205 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.606363 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.609267 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="ceilometer-central-agent" containerID="cri-o://5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da" gracePeriod=30 Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.609633 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="sg-core" containerID="cri-o://546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873" gracePeriod=30 Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.609637 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="ceilometer-notification-agent" containerID="cri-o://1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426" gracePeriod=30 Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.609676 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="proxy-httpd" containerID="cri-o://79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518" gracePeriod=30 Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.639152 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerID="2972898c0e994d6be6707392b64e73292af89d7f46e2b233c34ccb236b53772b" exitCode=143 Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.639296 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4da45e-3335-4db0-b45a-b5270296ee35","Type":"ContainerDied","Data":"2972898c0e994d6be6707392b64e73292af89d7f46e2b233c34ccb236b53772b"} Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.656973 4763 generic.go:334] "Generic (PLEG): container finished" podID="aaebec72-47cd-4ce5-b06b-d4814fba9214" containerID="686213d4b4263421d8cbb91e30ca2b44816308bdf4ea83099b02ab13fbaf1258" exitCode=0 Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.657004 4763 generic.go:334] "Generic (PLEG): container finished" podID="aaebec72-47cd-4ce5-b06b-d4814fba9214" containerID="88ae4d3dde2710a1efacdf602a8c10d81c64e05ba43cf75a14a42b318a53d4fe" exitCode=143 Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.657030 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aaebec72-47cd-4ce5-b06b-d4814fba9214","Type":"ContainerDied","Data":"686213d4b4263421d8cbb91e30ca2b44816308bdf4ea83099b02ab13fbaf1258"} Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.657066 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aaebec72-47cd-4ce5-b06b-d4814fba9214","Type":"ContainerDied","Data":"88ae4d3dde2710a1efacdf602a8c10d81c64e05ba43cf75a14a42b318a53d4fe"} Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.657081 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"aaebec72-47cd-4ce5-b06b-d4814fba9214","Type":"ContainerDied","Data":"d5a37fb3416c4d516dc0b9f99bb5736c17bd3578e4116ddae1bb20d7d8e69e5e"} Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.657093 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5a37fb3416c4d516dc0b9f99bb5736c17bd3578e4116ddae1bb20d7d8e69e5e" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.662194 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.777378 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.811189 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-config-data\") pod \"aaebec72-47cd-4ce5-b06b-d4814fba9214\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.811309 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqz44\" (UniqueName: \"kubernetes.io/projected/aaebec72-47cd-4ce5-b06b-d4814fba9214-kube-api-access-lqz44\") pod \"aaebec72-47cd-4ce5-b06b-d4814fba9214\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.811349 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-combined-ca-bundle\") pod \"aaebec72-47cd-4ce5-b06b-d4814fba9214\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.811414 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aaebec72-47cd-4ce5-b06b-d4814fba9214-logs\") pod \"aaebec72-47cd-4ce5-b06b-d4814fba9214\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.811447 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-nova-metadata-tls-certs\") pod \"aaebec72-47cd-4ce5-b06b-d4814fba9214\" (UID: \"aaebec72-47cd-4ce5-b06b-d4814fba9214\") " Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.817767 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaebec72-47cd-4ce5-b06b-d4814fba9214-logs" (OuterVolumeSpecName: "logs") pod "aaebec72-47cd-4ce5-b06b-d4814fba9214" (UID: "aaebec72-47cd-4ce5-b06b-d4814fba9214"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.821544 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaebec72-47cd-4ce5-b06b-d4814fba9214-kube-api-access-lqz44" (OuterVolumeSpecName: "kube-api-access-lqz44") pod "aaebec72-47cd-4ce5-b06b-d4814fba9214" (UID: "aaebec72-47cd-4ce5-b06b-d4814fba9214"). InnerVolumeSpecName "kube-api-access-lqz44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.836292 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.852067 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaebec72-47cd-4ce5-b06b-d4814fba9214" (UID: "aaebec72-47cd-4ce5-b06b-d4814fba9214"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.877465 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-config-data" (OuterVolumeSpecName: "config-data") pod "aaebec72-47cd-4ce5-b06b-d4814fba9214" (UID: "aaebec72-47cd-4ce5-b06b-d4814fba9214"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.897898 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "aaebec72-47cd-4ce5-b06b-d4814fba9214" (UID: "aaebec72-47cd-4ce5-b06b-d4814fba9214"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.913034 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.913074 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqz44\" (UniqueName: \"kubernetes.io/projected/aaebec72-47cd-4ce5-b06b-d4814fba9214-kube-api-access-lqz44\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.913127 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.913139 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aaebec72-47cd-4ce5-b06b-d4814fba9214-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:37 crc kubenswrapper[4763]: I0930 13:56:37.913149 4763 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaebec72-47cd-4ce5-b06b-d4814fba9214-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:38 crc kubenswrapper[4763]: E0930 13:56:38.125493 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffea96d1_fdc0_46c6_bd7c_66b023bc6ccc.slice/crio-5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffea96d1_fdc0_46c6_bd7c_66b023bc6ccc.slice/crio-conmon-5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da.scope\": RecentStats: unable to find data in memory 
cache]" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.526899 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4412eaea-f645-451a-8b88-c562357c6b1e" path="/var/lib/kubelet/pods/4412eaea-f645-451a-8b88-c562357c6b1e/volumes" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.667806 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"aefbc43e-494e-48a6-963c-7be9d0159387","Type":"ContainerStarted","Data":"585dbe18ef01f1b67ce719782f36c12ee759638e77c5c8ef61ae81a0620d03f1"} Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.667879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"aefbc43e-494e-48a6-963c-7be9d0159387","Type":"ContainerStarted","Data":"dcdc4f01401a7f30f315ce3f13aa64e2c734af8d34f6c95274ea3b5881d96cf8"} Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.667960 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.673588 4763 generic.go:334] "Generic (PLEG): container finished" podID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerID="79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518" exitCode=0 Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.673640 4763 generic.go:334] "Generic (PLEG): container finished" podID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerID="546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873" exitCode=2 Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.673650 4763 generic.go:334] "Generic (PLEG): container finished" podID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerID="5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da" exitCode=0 Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.673695 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc","Type":"ContainerDied","Data":"79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518"} Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.673722 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc","Type":"ContainerDied","Data":"546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873"} Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.673735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc","Type":"ContainerDied","Data":"5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da"} Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.677875 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.679319 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0","Type":"ContainerStarted","Data":"a9c41d23775fc1eb5013bbc346ef1bd994014c9076f54a0e50116c9cc0474cc7"} Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.679351 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0","Type":"ContainerStarted","Data":"a1e33831828add2c6d80f40a159bc893182b58884a01730e224a39208c1ddd8b"} Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.679369 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.702919 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.702853903 podStartE2EDuration="2.702853903s" podCreationTimestamp="2025-09-30 13:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:38.690110914 +0000 UTC m=+1270.828671199" watchObservedRunningTime="2025-09-30 13:56:38.702853903 +0000 UTC m=+1270.841414208" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.713416 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.728337 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.748980 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:38 crc kubenswrapper[4763]: E0930 13:56:38.749459 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaebec72-47cd-4ce5-b06b-d4814fba9214" containerName="nova-metadata-log" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.749496 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaebec72-47cd-4ce5-b06b-d4814fba9214" containerName="nova-metadata-log" Sep 30 13:56:38 crc kubenswrapper[4763]: E0930 13:56:38.749519 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaebec72-47cd-4ce5-b06b-d4814fba9214" containerName="nova-metadata-metadata" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.749528 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaebec72-47cd-4ce5-b06b-d4814fba9214" containerName="nova-metadata-metadata" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.749770 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaebec72-47cd-4ce5-b06b-d4814fba9214" containerName="nova-metadata-log" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.749803 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaebec72-47cd-4ce5-b06b-d4814fba9214" containerName="nova-metadata-metadata" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.750967 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.753203 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.753797 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.779945 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.406339364 podStartE2EDuration="2.779920554s" podCreationTimestamp="2025-09-30 13:56:36 +0000 UTC" firstStartedPulling="2025-09-30 13:56:37.676792877 +0000 UTC m=+1269.815353172" lastFinishedPulling="2025-09-30 13:56:38.050374077 +0000 UTC m=+1270.188934362" observedRunningTime="2025-09-30 13:56:38.728001794 +0000 UTC m=+1270.866562079" watchObservedRunningTime="2025-09-30 13:56:38.779920554 +0000 UTC m=+1270.918480839" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.789816 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.835103 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-logs\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.835187 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhgtl\" (UniqueName: \"kubernetes.io/projected/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-kube-api-access-nhgtl\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.835227 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-config-data\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.835625 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.835673 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.937105 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.937163 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.937200 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-logs\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.937258 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhgtl\" (UniqueName: \"kubernetes.io/projected/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-kube-api-access-nhgtl\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.937298 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-config-data\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.942307 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.944858 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-logs\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.945085 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.945526 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-config-data\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: I0930 13:56:38.961854 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhgtl\" (UniqueName: \"kubernetes.io/projected/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-kube-api-access-nhgtl\") pod \"nova-metadata-0\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " pod="openstack/nova-metadata-0" Sep 30 13:56:38 crc kubenswrapper[4763]: E0930 13:56:38.969501 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:56:38 crc 
kubenswrapper[4763]: E0930 13:56:38.971446 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:56:38 crc kubenswrapper[4763]: E0930 13:56:38.978004 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:56:38 crc kubenswrapper[4763]: E0930 13:56:38.978097 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2b436487-c350-46d9-b1b8-75314d1d4605" containerName="nova-scheduler-scheduler" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.069229 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.154913 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.260216 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-log-httpd\") pod \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.260386 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-sg-core-conf-yaml\") pod \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.260492 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-scripts\") pod \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.260527 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbcxd\" (UniqueName: \"kubernetes.io/projected/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-kube-api-access-vbcxd\") pod \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.260625 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-combined-ca-bundle\") pod \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.260666 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-config-data\") pod \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\" (UID: 
\"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.260689 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-run-httpd\") pod \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\" (UID: \"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc\") " Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.261434 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" (UID: "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.261474 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" (UID: "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.270354 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-kube-api-access-vbcxd" (OuterVolumeSpecName: "kube-api-access-vbcxd") pod "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" (UID: "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc"). InnerVolumeSpecName "kube-api-access-vbcxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.280200 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-scripts" (OuterVolumeSpecName: "scripts") pod "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" (UID: "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.323295 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" (UID: "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.363277 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.363317 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.363330 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.363342 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbcxd\" (UniqueName: \"kubernetes.io/projected/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-kube-api-access-vbcxd\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.363353 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.380935 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" (UID: "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.407748 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-config-data" (OuterVolumeSpecName: "config-data") pod "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" (UID: "ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.464863 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.464898 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.609300 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.688731 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32b6e2c8-d14f-4f03-b830-d9ef617b81f9","Type":"ContainerStarted","Data":"af228d55e8797fc37b8e9e232ec7e7e18528d183e913dfd1bce70fb4b4818a1e"} Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.691922 4763 generic.go:334] "Generic (PLEG): container finished" podID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerID="1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426" exitCode=0 Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.692788 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.693222 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc","Type":"ContainerDied","Data":"1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426"} Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.693272 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc","Type":"ContainerDied","Data":"e6994d25a1dac58897ef4e07d7cebfb95c9aaa20835e21bc4e231582bfc7b27a"} Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.693298 4763 scope.go:117] "RemoveContainer" containerID="79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.726408 4763 scope.go:117] "RemoveContainer" containerID="546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.759004 4763 scope.go:117] "RemoveContainer" containerID="1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.775665 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.785114 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.793077 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:56:39 crc kubenswrapper[4763]: E0930 13:56:39.793452 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="proxy-httpd" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.793469 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="proxy-httpd" Sep 30 13:56:39 crc kubenswrapper[4763]: E0930 13:56:39.793489 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="sg-core" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.793495 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="sg-core" Sep 30 13:56:39 crc kubenswrapper[4763]: E0930 13:56:39.793513 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="ceilometer-notification-agent" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.793519 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="ceilometer-notification-agent" Sep 30 13:56:39 crc kubenswrapper[4763]: E0930 13:56:39.793528 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="ceilometer-central-agent" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.793534 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="ceilometer-central-agent" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.793730 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="sg-core" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.793747 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="ceilometer-notification-agent" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.793753 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="proxy-httpd" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.793769 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" containerName="ceilometer-central-agent" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.795365 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.796755 4763 scope.go:117] "RemoveContainer" containerID="5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.802411 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.802694 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.802849 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.816423 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.829759 4763 scope.go:117] "RemoveContainer" containerID="79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518" Sep 30 13:56:39 crc kubenswrapper[4763]: E0930 13:56:39.858950 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518\": container with ID starting with 79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518 not found: ID does not exist" containerID="79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.858995 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518"} err="failed to get container status \"79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518\": rpc error: code = NotFound desc = could not find container \"79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518\": container with ID starting with 79b9e91861d5be4632a604b8e9978da3f08b9aa0129907367fc4272568186518 not found: ID does not exist" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.859026 4763 scope.go:117] "RemoveContainer" containerID="546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873" Sep 30 13:56:39 crc kubenswrapper[4763]: E0930 13:56:39.867003 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873\": container with ID starting with 546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873 not found: ID does not exist" containerID="546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.867043 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873"} err="failed to get container status \"546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873\": rpc error: code = NotFound desc = could not find container \"546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873\": container with ID starting with 546e9a60539ea9e9735abc916cafe9343f535f64a90f2776a42961e62bbf4873 not found: ID does not exist" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.867073 4763 scope.go:117] "RemoveContainer" containerID="1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426" Sep 30 13:56:39 
crc kubenswrapper[4763]: E0930 13:56:39.878910 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426\": container with ID starting with 1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426 not found: ID does not exist" containerID="1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.878953 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426"} err="failed to get container status \"1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426\": rpc error: code = NotFound desc = could not find container \"1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426\": container with ID starting with 1990fa7207bb4617262233205e1cef942f0fab0661612f8cc5caec120a254426 not found: ID does not exist" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.878980 4763 scope.go:117] "RemoveContainer" containerID="5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da" Sep 30 13:56:39 crc kubenswrapper[4763]: E0930 13:56:39.882696 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da\": container with ID starting with 5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da not found: ID does not exist" containerID="5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.882741 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da"} err="failed to get container status \"5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da\": rpc error: code = NotFound desc = could not find container \"5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da\": container with ID starting with 5ed20f09da9aaa8e2c6235d87717a17dc8ec61f642a8c8d26f0c3d63044ee1da not found: ID does not exist" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.976709 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kbhq\" (UniqueName: \"kubernetes.io/projected/2fc26096-0b60-4cc8-9d34-a33991f7eae1-kube-api-access-4kbhq\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.976769 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-run-httpd\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.976824 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-log-httpd\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.976852 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.976904 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.976952 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.976984 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-config-data\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:39 crc kubenswrapper[4763]: I0930 13:56:39.977012 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-scripts\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.080038 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.080103 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-config-data\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.080145 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-scripts\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.080214 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kbhq\" (UniqueName: \"kubernetes.io/projected/2fc26096-0b60-4cc8-9d34-a33991f7eae1-kube-api-access-4kbhq\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.080271 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-run-httpd\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.080350 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-log-httpd\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.080395 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.080477 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.082563 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-log-httpd\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.083050 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-run-httpd\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.084574 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-scripts\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.085046 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-config-data\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.085513 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.086144 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.086436 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.110343 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4kbhq\" (UniqueName: \"kubernetes.io/projected/2fc26096-0b60-4cc8-9d34-a33991f7eae1-kube-api-access-4kbhq\") pod \"ceilometer-0\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.119506 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.509139 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaebec72-47cd-4ce5-b06b-d4814fba9214" path="/var/lib/kubelet/pods/aaebec72-47cd-4ce5-b06b-d4814fba9214/volumes" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.510355 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc" path="/var/lib/kubelet/pods/ffea96d1-fdc0-46c6-bd7c-66b023bc6ccc/volumes" Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.583252 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.702771 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32b6e2c8-d14f-4f03-b830-d9ef617b81f9","Type":"ContainerStarted","Data":"3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248"} Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.703082 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32b6e2c8-d14f-4f03-b830-d9ef617b81f9","Type":"ContainerStarted","Data":"6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505"} Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.708321 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fc26096-0b60-4cc8-9d34-a33991f7eae1","Type":"ContainerStarted","Data":"eabb5f4f6260d51bc0c39c60650d0d21b81c7c23dae4c5aef5b42d84d2d6a9a5"} Sep 30 13:56:40 crc kubenswrapper[4763]: I0930 13:56:40.743174 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.74315498 podStartE2EDuration="2.74315498s" podCreationTimestamp="2025-09-30 13:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:40.722042561 +0000 UTC m=+1272.860602866" watchObservedRunningTime="2025-09-30 13:56:40.74315498 +0000 UTC m=+1272.881715265" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.130943 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.225863 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hcd7\" (UniqueName: \"kubernetes.io/projected/2b436487-c350-46d9-b1b8-75314d1d4605-kube-api-access-8hcd7\") pod \"2b436487-c350-46d9-b1b8-75314d1d4605\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.226225 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-combined-ca-bundle\") pod \"2b436487-c350-46d9-b1b8-75314d1d4605\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.226304 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-config-data\") pod \"2b436487-c350-46d9-b1b8-75314d1d4605\" (UID: \"2b436487-c350-46d9-b1b8-75314d1d4605\") " Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.230802 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b436487-c350-46d9-b1b8-75314d1d4605-kube-api-access-8hcd7" (OuterVolumeSpecName: "kube-api-access-8hcd7") pod "2b436487-c350-46d9-b1b8-75314d1d4605" (UID: "2b436487-c350-46d9-b1b8-75314d1d4605"). InnerVolumeSpecName "kube-api-access-8hcd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.262384 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b436487-c350-46d9-b1b8-75314d1d4605" (UID: "2b436487-c350-46d9-b1b8-75314d1d4605"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.265085 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-config-data" (OuterVolumeSpecName: "config-data") pod "2b436487-c350-46d9-b1b8-75314d1d4605" (UID: "2b436487-c350-46d9-b1b8-75314d1d4605"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.328306 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hcd7\" (UniqueName: \"kubernetes.io/projected/2b436487-c350-46d9-b1b8-75314d1d4605-kube-api-access-8hcd7\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.328342 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.328351 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b436487-c350-46d9-b1b8-75314d1d4605-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.730116 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fc26096-0b60-4cc8-9d34-a33991f7eae1","Type":"ContainerStarted","Data":"4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05"} Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.732890 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerID="38dd2ae4bc923388e12df617d9e41a05f8f14faba54341609c11baf3dfffbd25" exitCode=0 Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.732949 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4da45e-3335-4db0-b45a-b5270296ee35","Type":"ContainerDied","Data":"38dd2ae4bc923388e12df617d9e41a05f8f14faba54341609c11baf3dfffbd25"} Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.736124 4763 generic.go:334] "Generic (PLEG): container finished" podID="2b436487-c350-46d9-b1b8-75314d1d4605" containerID="c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485" exitCode=0 Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.736541 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b436487-c350-46d9-b1b8-75314d1d4605","Type":"ContainerDied","Data":"c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485"} Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.736574 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.736580 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b436487-c350-46d9-b1b8-75314d1d4605","Type":"ContainerDied","Data":"5a7918b783b4461ef46f57980d5541f2f0aceaa4e841dfef219959aa4ff39d1c"} Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.736594 4763 scope.go:117] "RemoveContainer" containerID="c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.788897 4763 scope.go:117] "RemoveContainer" containerID="c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485" Sep 30 13:56:41 crc kubenswrapper[4763]: E0930 13:56:41.790124 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485\": container with ID starting with c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485 not found: ID does not exist" containerID="c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.790170 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485"} err="failed to get container status \"c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485\": rpc error: code = NotFound desc = could not find container \"c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485\": container with ID starting with c293a76306bc2b810060e2457d9f7d9d2d44cfca5cbb033d1dcc67b71eef3485 not found: ID does not exist" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.793988 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.828457 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.844394 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:56:41 crc kubenswrapper[4763]: E0930 13:56:41.845064 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b436487-c350-46d9-b1b8-75314d1d4605" containerName="nova-scheduler-scheduler" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.845084 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b436487-c350-46d9-b1b8-75314d1d4605" containerName="nova-scheduler-scheduler" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.845262 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b436487-c350-46d9-b1b8-75314d1d4605" containerName="nova-scheduler-scheduler" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.845856 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.853418 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.857071 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.936537 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.937665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4hm4\" (UniqueName: \"kubernetes.io/projected/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-kube-api-access-f4hm4\") pod \"nova-scheduler-0\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.937754 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:41 crc kubenswrapper[4763]: I0930 13:56:41.937902 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-config-data\") pod \"nova-scheduler-0\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.038790 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-combined-ca-bundle\") pod \"ba4da45e-3335-4db0-b45a-b5270296ee35\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.038895 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4da45e-3335-4db0-b45a-b5270296ee35-logs\") pod \"ba4da45e-3335-4db0-b45a-b5270296ee35\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.038927 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-config-data\") pod \"ba4da45e-3335-4db0-b45a-b5270296ee35\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.039083 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdwh2\" (UniqueName: \"kubernetes.io/projected/ba4da45e-3335-4db0-b45a-b5270296ee35-kube-api-access-bdwh2\") pod \"ba4da45e-3335-4db0-b45a-b5270296ee35\" (UID: \"ba4da45e-3335-4db0-b45a-b5270296ee35\") " Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.039426 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-config-data\") pod \"nova-scheduler-0\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.039507 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4hm4\" (UniqueName: \"kubernetes.io/projected/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-kube-api-access-f4hm4\") pod \"nova-scheduler-0\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.039577 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.040767 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba4da45e-3335-4db0-b45a-b5270296ee35-logs" (OuterVolumeSpecName: "logs") pod "ba4da45e-3335-4db0-b45a-b5270296ee35" (UID: "ba4da45e-3335-4db0-b45a-b5270296ee35"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.046189 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.046960 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba4da45e-3335-4db0-b45a-b5270296ee35-kube-api-access-bdwh2" (OuterVolumeSpecName: "kube-api-access-bdwh2") pod "ba4da45e-3335-4db0-b45a-b5270296ee35" (UID: "ba4da45e-3335-4db0-b45a-b5270296ee35"). InnerVolumeSpecName "kube-api-access-bdwh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.055208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-config-data\") pod \"nova-scheduler-0\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.058394 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4hm4\" (UniqueName: \"kubernetes.io/projected/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-kube-api-access-f4hm4\") pod \"nova-scheduler-0\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " pod="openstack/nova-scheduler-0" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.068404 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba4da45e-3335-4db0-b45a-b5270296ee35" (UID: "ba4da45e-3335-4db0-b45a-b5270296ee35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.068768 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-config-data" (OuterVolumeSpecName: "config-data") pod "ba4da45e-3335-4db0-b45a-b5270296ee35" (UID: "ba4da45e-3335-4db0-b45a-b5270296ee35"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.141694 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.141730 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4da45e-3335-4db0-b45a-b5270296ee35-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.141742 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4da45e-3335-4db0-b45a-b5270296ee35-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.141753 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdwh2\" (UniqueName: \"kubernetes.io/projected/ba4da45e-3335-4db0-b45a-b5270296ee35-kube-api-access-bdwh2\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.169289 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.247044 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.503680 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b436487-c350-46d9-b1b8-75314d1d4605" path="/var/lib/kubelet/pods/2b436487-c350-46d9-b1b8-75314d1d4605/volumes" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.744787 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.752770 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fc26096-0b60-4cc8-9d34-a33991f7eae1","Type":"ContainerStarted","Data":"cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f"} Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.754993 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4da45e-3335-4db0-b45a-b5270296ee35","Type":"ContainerDied","Data":"e470297eff9b3b4e1f1c0e6de0c4873cc33200389591f49e114d7f7c6d3e4a67"} Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.755996 4763 scope.go:117] "RemoveContainer" containerID="38dd2ae4bc923388e12df617d9e41a05f8f14faba54341609c11baf3dfffbd25" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.756205 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.836951 4763 scope.go:117] "RemoveContainer" containerID="2972898c0e994d6be6707392b64e73292af89d7f46e2b233c34ccb236b53772b" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.846532 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.857494 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.879474 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 13:56:42 crc kubenswrapper[4763]: E0930 13:56:42.879865 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerName="nova-api-api" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.879881 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerName="nova-api-api" Sep 30 13:56:42 crc kubenswrapper[4763]: E0930 13:56:42.879893 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerName="nova-api-log" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.879899 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerName="nova-api-log" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.880139 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerName="nova-api-api" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.880166 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4da45e-3335-4db0-b45a-b5270296ee35" containerName="nova-api-log" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.891413 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.900069 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:56:42 crc kubenswrapper[4763]: I0930 13:56:42.907419 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.074043 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-config-data\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.074155 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6dkt\" (UniqueName: \"kubernetes.io/projected/bd06976a-fec0-4550-a0b3-aa311a08cbd7-kube-api-access-w6dkt\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.074212 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.074252 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd06976a-fec0-4550-a0b3-aa311a08cbd7-logs\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.176723 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd06976a-fec0-4550-a0b3-aa311a08cbd7-logs\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.176953 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-config-data\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.177507 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd06976a-fec0-4550-a0b3-aa311a08cbd7-logs\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.178103 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6dkt\" (UniqueName: \"kubernetes.io/projected/bd06976a-fec0-4550-a0b3-aa311a08cbd7-kube-api-access-w6dkt\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.178173 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " 
pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.181958 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.182530 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-config-data\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.196227 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6dkt\" (UniqueName: \"kubernetes.io/projected/bd06976a-fec0-4550-a0b3-aa311a08cbd7-kube-api-access-w6dkt\") pod \"nova-api-0\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.232077 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.694560 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:56:43 crc kubenswrapper[4763]: W0930 13:56:43.707562 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd06976a_fec0_4550_a0b3_aa311a08cbd7.slice/crio-5e8177658b46343f38f4aed92f8e6ed17efb2674950554d73b596e4c7dd9d0ba WatchSource:0}: Error finding container 5e8177658b46343f38f4aed92f8e6ed17efb2674950554d73b596e4c7dd9d0ba: Status 404 returned error can't find the container with id 5e8177658b46343f38f4aed92f8e6ed17efb2674950554d73b596e4c7dd9d0ba Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.764325 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd06976a-fec0-4550-a0b3-aa311a08cbd7","Type":"ContainerStarted","Data":"5e8177658b46343f38f4aed92f8e6ed17efb2674950554d73b596e4c7dd9d0ba"} Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.767607 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fc26096-0b60-4cc8-9d34-a33991f7eae1","Type":"ContainerStarted","Data":"36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90"} Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.770554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12ee2a46-828b-4494-a721-5c8e3e6c4fa3","Type":"ContainerStarted","Data":"4a281cb7a3c3e7a37caa4b764b005d08332aabdc751dd41fbd02bcd2966542ce"} Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.770579 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12ee2a46-828b-4494-a721-5c8e3e6c4fa3","Type":"ContainerStarted","Data":"e170c7f1eb630c6eef251bbd94dfa7f20020848bff35157a21d239b41295d58d"} Sep 30 13:56:43 crc kubenswrapper[4763]: I0930 13:56:43.801482 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8014599909999998 podStartE2EDuration="2.801459991s" podCreationTimestamp="2025-09-30 13:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 13:56:43.78591214 +0000 UTC m=+1275.924472425" watchObservedRunningTime="2025-09-30 13:56:43.801459991 +0000 UTC m=+1275.940020276" Sep 30 13:56:44 crc kubenswrapper[4763]: I0930 13:56:44.069501 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 13:56:44 crc kubenswrapper[4763]: I0930 13:56:44.069566 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 13:56:44 crc kubenswrapper[4763]: I0930 13:56:44.504775 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba4da45e-3335-4db0-b45a-b5270296ee35" path="/var/lib/kubelet/pods/ba4da45e-3335-4db0-b45a-b5270296ee35/volumes" Sep 30 13:56:44 crc kubenswrapper[4763]: I0930 13:56:44.781951 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fc26096-0b60-4cc8-9d34-a33991f7eae1","Type":"ContainerStarted","Data":"2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432"} Sep 30 13:56:44 crc kubenswrapper[4763]: I0930 13:56:44.784170 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd06976a-fec0-4550-a0b3-aa311a08cbd7","Type":"ContainerStarted","Data":"8cb90fd9f008eacc105c567131bc8f2f0c0739eead1d38d969f4abcd2591652e"} Sep 30 13:56:44 crc kubenswrapper[4763]: I0930 13:56:44.784231 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd06976a-fec0-4550-a0b3-aa311a08cbd7","Type":"ContainerStarted","Data":"21d12cee29d5fcafec5d59a600136caee93d6575f8ab4bafd39e24a24928fe9c"} Sep 30 13:56:44 crc kubenswrapper[4763]: I0930 13:56:44.808652 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.179756557 podStartE2EDuration="5.808633493s" podCreationTimestamp="2025-09-30 13:56:39 +0000 UTC" firstStartedPulling="2025-09-30 13:56:40.591862819 +0000 UTC m=+1272.730423104" lastFinishedPulling="2025-09-30 13:56:44.220739745 +0000 UTC m=+1276.359300040" observedRunningTime="2025-09-30 13:56:44.802069129 +0000 UTC m=+1276.940629414" watchObservedRunningTime="2025-09-30 13:56:44.808633493 +0000 UTC m=+1276.947193778" Sep 30 13:56:44 crc kubenswrapper[4763]: I0930 13:56:44.826073 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.826048609 podStartE2EDuration="2.826048609s" podCreationTimestamp="2025-09-30 13:56:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:44.823882046 +0000 UTC m=+1276.962442331" watchObservedRunningTime="2025-09-30 13:56:44.826048609 +0000 UTC m=+1276.964608894" Sep 30 13:56:45 crc kubenswrapper[4763]: I0930 13:56:45.794296 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 13:56:47 crc kubenswrapper[4763]: I0930 13:56:47.102105 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 13:56:47 crc kubenswrapper[4763]: I0930 13:56:47.247297 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 13:56:49 crc kubenswrapper[4763]: I0930 13:56:49.069515 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 13:56:49 crc kubenswrapper[4763]: I0930 13:56:49.069960 4763 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 13:56:50 crc kubenswrapper[4763]: I0930 13:56:50.086891 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:56:50 crc kubenswrapper[4763]: I0930 13:56:50.086914 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:56:52 crc kubenswrapper[4763]: I0930 13:56:52.248084 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 13:56:52 crc kubenswrapper[4763]: I0930 13:56:52.277747 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 13:56:52 crc kubenswrapper[4763]: I0930 13:56:52.883635 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 13:56:53 crc kubenswrapper[4763]: I0930 13:56:53.232884 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:56:53 crc kubenswrapper[4763]: I0930 13:56:53.233232 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:56:54 crc kubenswrapper[4763]: I0930 13:56:54.315804 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:56:54 crc kubenswrapper[4763]: I0930 13:56:54.315794 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:56:59 crc kubenswrapper[4763]: I0930 13:56:59.075145 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 13:56:59 crc kubenswrapper[4763]: I0930 13:56:59.075692 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 13:56:59 crc kubenswrapper[4763]: I0930 13:56:59.082955 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 13:56:59 crc kubenswrapper[4763]: I0930 13:56:59.083305 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 13:56:59 crc kubenswrapper[4763]: I0930 13:56:59.915401 4763 generic.go:334] "Generic (PLEG): container finished" podID="c8b04d40-51cf-4a31-b70c-bd760d800aca" containerID="272acd02c53a37dfb9729b5a518d4b01ce95a61821dc87ad8baac1d2c9db8bfa" exitCode=137 Sep 30 13:56:59 crc kubenswrapper[4763]: I0930 13:56:59.915570 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"c8b04d40-51cf-4a31-b70c-bd760d800aca","Type":"ContainerDied","Data":"272acd02c53a37dfb9729b5a518d4b01ce95a61821dc87ad8baac1d2c9db8bfa"} Sep 30 13:56:59 crc kubenswrapper[4763]: I0930 13:56:59.915754 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c8b04d40-51cf-4a31-b70c-bd760d800aca","Type":"ContainerDied","Data":"76594ab690d338ab63d5bc1c07db9e063729a0647426be38bb7052e6c384d75a"} Sep 30 13:56:59 crc kubenswrapper[4763]: I0930 13:56:59.915775 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76594ab690d338ab63d5bc1c07db9e063729a0647426be38bb7052e6c384d75a" Sep 30 13:56:59 crc kubenswrapper[4763]: I0930 13:56:59.924323 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.003414 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-config-data\") pod \"c8b04d40-51cf-4a31-b70c-bd760d800aca\" (UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.003594 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mpwf\" (UniqueName: \"kubernetes.io/projected/c8b04d40-51cf-4a31-b70c-bd760d800aca-kube-api-access-4mpwf\") pod \"c8b04d40-51cf-4a31-b70c-bd760d800aca\" (UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.003759 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-combined-ca-bundle\") pod \"c8b04d40-51cf-4a31-b70c-bd760d800aca\" (UID: \"c8b04d40-51cf-4a31-b70c-bd760d800aca\") " Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.011862 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b04d40-51cf-4a31-b70c-bd760d800aca-kube-api-access-4mpwf" (OuterVolumeSpecName: "kube-api-access-4mpwf") pod "c8b04d40-51cf-4a31-b70c-bd760d800aca" (UID: "c8b04d40-51cf-4a31-b70c-bd760d800aca"). InnerVolumeSpecName "kube-api-access-4mpwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.036156 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8b04d40-51cf-4a31-b70c-bd760d800aca" (UID: "c8b04d40-51cf-4a31-b70c-bd760d800aca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.036858 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-config-data" (OuterVolumeSpecName: "config-data") pod "c8b04d40-51cf-4a31-b70c-bd760d800aca" (UID: "c8b04d40-51cf-4a31-b70c-bd760d800aca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.106296 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.106335 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b04d40-51cf-4a31-b70c-bd760d800aca-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.106353 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mpwf\" (UniqueName: \"kubernetes.io/projected/c8b04d40-51cf-4a31-b70c-bd760d800aca-kube-api-access-4mpwf\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.923742 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.944201 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.953311 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.971863 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:57:00 crc kubenswrapper[4763]: E0930 13:57:00.972358 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b04d40-51cf-4a31-b70c-bd760d800aca" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.972382 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b04d40-51cf-4a31-b70c-bd760d800aca" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.972587 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b04d40-51cf-4a31-b70c-bd760d800aca" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.973256 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.975916 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.976148 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.978215 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 30 13:57:00 crc kubenswrapper[4763]: I0930 13:57:00.992551 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.126183 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.126266 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sktq\" (UniqueName: \"kubernetes.io/projected/783d0307-40e6-4d1e-9728-b1fe356e6b52-kube-api-access-4sktq\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.126442 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.126480 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.126520 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.228688 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.228767 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " 
pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.228841 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.228879 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sktq\" (UniqueName: \"kubernetes.io/projected/783d0307-40e6-4d1e-9728-b1fe356e6b52-kube-api-access-4sktq\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.229008 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.241577 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.242572 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.242854 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.243092 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.253208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sktq\" (UniqueName: \"kubernetes.io/projected/783d0307-40e6-4d1e-9728-b1fe356e6b52-kube-api-access-4sktq\") pod \"nova-cell1-novncproxy-0\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.292046 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.779741 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:57:01 crc kubenswrapper[4763]: W0930 13:57:01.783756 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod783d0307_40e6_4d1e_9728_b1fe356e6b52.slice/crio-f2eb8078d6b11977d27a5d4c8403e9ce60697f994c479ce5edef18d8ebe98011 WatchSource:0}: Error finding container f2eb8078d6b11977d27a5d4c8403e9ce60697f994c479ce5edef18d8ebe98011: Status 404 returned error can't find the container with id f2eb8078d6b11977d27a5d4c8403e9ce60697f994c479ce5edef18d8ebe98011 Sep 30 13:57:01 crc kubenswrapper[4763]: I0930 13:57:01.936960 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"783d0307-40e6-4d1e-9728-b1fe356e6b52","Type":"ContainerStarted","Data":"f2eb8078d6b11977d27a5d4c8403e9ce60697f994c479ce5edef18d8ebe98011"} Sep 30 13:57:02 crc kubenswrapper[4763]: I0930 13:57:02.500576 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8b04d40-51cf-4a31-b70c-bd760d800aca" path="/var/lib/kubelet/pods/c8b04d40-51cf-4a31-b70c-bd760d800aca/volumes" Sep 30 13:57:02 crc kubenswrapper[4763]: I0930 13:57:02.950314 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"783d0307-40e6-4d1e-9728-b1fe356e6b52","Type":"ContainerStarted","Data":"4f8c5e6c6bac428024dc97ceaef682e62b42b67d2f61d3a18743765dbbf6718d"} Sep 30 13:57:02 crc kubenswrapper[4763]: I0930 13:57:02.968438 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.968419166 podStartE2EDuration="2.968419166s" podCreationTimestamp="2025-09-30 13:57:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:02.966098497 +0000 UTC m=+1295.104658792" watchObservedRunningTime="2025-09-30 13:57:02.968419166 +0000 UTC m=+1295.106979451" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.241475 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.241569 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.242159 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.242240 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.245498 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.246047 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.440262 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-ktclz"] Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.442831 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.462763 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-ktclz"] Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.586009 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-config\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.586071 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-sb\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.586268 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-svc\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.586389 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2k7f\" (UniqueName: \"kubernetes.io/projected/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-kube-api-access-h2k7f\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.586486 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-swift-storage-0\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.586753 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-nb\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.689211 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-config\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.689270 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-sb\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.689319 4763 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-svc\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.689352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2k7f\" (UniqueName: \"kubernetes.io/projected/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-kube-api-access-h2k7f\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.689387 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-swift-storage-0\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.689437 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-nb\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.690480 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-config\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.690497 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-svc\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.690511 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-sb\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.690619 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-swift-storage-0\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.690647 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-nb\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.710786 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2k7f\" (UniqueName: 
\"kubernetes.io/projected/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-kube-api-access-h2k7f\") pod \"dnsmasq-dns-cc449b9dc-ktclz\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:03 crc kubenswrapper[4763]: I0930 13:57:03.775363 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:04 crc kubenswrapper[4763]: I0930 13:57:04.374594 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-ktclz"] Sep 30 13:57:04 crc kubenswrapper[4763]: W0930 13:57:04.383090 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ef00d68_6c21_4ee7_8be8_53f7c1edb2f3.slice/crio-99dd7c5c327df9d137d54643d744320fd2f62ce8f9b98f591caa3b4183e23a8a WatchSource:0}: Error finding container 99dd7c5c327df9d137d54643d744320fd2f62ce8f9b98f591caa3b4183e23a8a: Status 404 returned error can't find the container with id 99dd7c5c327df9d137d54643d744320fd2f62ce8f9b98f591caa3b4183e23a8a Sep 30 13:57:04 crc kubenswrapper[4763]: I0930 13:57:04.988465 4763 generic.go:334] "Generic (PLEG): container finished" podID="2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" containerID="2589b94c1779aa45bfe6bc84dd80bb5efa5a6559e17433f8f4c0ba2be4f7b26d" exitCode=0 Sep 30 13:57:04 crc kubenswrapper[4763]: I0930 13:57:04.989748 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" event={"ID":"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3","Type":"ContainerDied","Data":"2589b94c1779aa45bfe6bc84dd80bb5efa5a6559e17433f8f4c0ba2be4f7b26d"} Sep 30 13:57:04 crc kubenswrapper[4763]: I0930 13:57:04.989960 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" event={"ID":"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3","Type":"ContainerStarted","Data":"99dd7c5c327df9d137d54643d744320fd2f62ce8f9b98f591caa3b4183e23a8a"} Sep 30 13:57:05 crc kubenswrapper[4763]: I0930 13:57:05.664173 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:57:05 crc kubenswrapper[4763]: I0930 13:57:05.665273 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="ceilometer-central-agent" containerID="cri-o://4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05" gracePeriod=30 Sep 30 13:57:05 crc kubenswrapper[4763]: I0930 13:57:05.665398 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="sg-core" containerID="cri-o://36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90" gracePeriod=30 Sep 30 13:57:05 crc kubenswrapper[4763]: I0930 13:57:05.665492 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="ceilometer-notification-agent" containerID="cri-o://cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f" gracePeriod=30 Sep 30 13:57:05 crc kubenswrapper[4763]: I0930 13:57:05.665561 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="proxy-httpd" containerID="cri-o://2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432" gracePeriod=30 Sep 30 13:57:05 crc 
kubenswrapper[4763]: I0930 13:57:05.674203 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.194:3000/\": EOF" Sep 30 13:57:05 crc kubenswrapper[4763]: I0930 13:57:05.999119 4763 generic.go:334] "Generic (PLEG): container finished" podID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerID="2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432" exitCode=0 Sep 30 13:57:05 crc kubenswrapper[4763]: I0930 13:57:05.999152 4763 generic.go:334] "Generic (PLEG): container finished" podID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerID="36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90" exitCode=2 Sep 30 13:57:05 crc kubenswrapper[4763]: I0930 13:57:05.999194 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fc26096-0b60-4cc8-9d34-a33991f7eae1","Type":"ContainerDied","Data":"2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432"} Sep 30 13:57:05 crc kubenswrapper[4763]: I0930 13:57:05.999249 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fc26096-0b60-4cc8-9d34-a33991f7eae1","Type":"ContainerDied","Data":"36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90"} Sep 30 13:57:06 crc kubenswrapper[4763]: I0930 13:57:06.001089 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" event={"ID":"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3","Type":"ContainerStarted","Data":"f18e050afde37900d0b00f1f42394f96b83e1b630126fc3ff1f6312b776bc3ae"} Sep 30 13:57:06 crc kubenswrapper[4763]: I0930 13:57:06.001282 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:06 crc kubenswrapper[4763]: I0930 13:57:06.026391 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" podStartSLOduration=3.026372498 podStartE2EDuration="3.026372498s" podCreationTimestamp="2025-09-30 13:57:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:06.02085961 +0000 UTC m=+1298.159419895" watchObservedRunningTime="2025-09-30 13:57:06.026372498 +0000 UTC m=+1298.164932783" Sep 30 13:57:06 crc kubenswrapper[4763]: I0930 13:57:06.060343 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:57:06 crc kubenswrapper[4763]: I0930 13:57:06.060406 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:57:06 crc kubenswrapper[4763]: I0930 13:57:06.292757 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:06 crc kubenswrapper[4763]: I0930 13:57:06.656749 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:57:06 crc kubenswrapper[4763]: I0930 
13:57:06.657067 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerName="nova-api-api" containerID="cri-o://8cb90fd9f008eacc105c567131bc8f2f0c0739eead1d38d969f4abcd2591652e" gracePeriod=30 Sep 30 13:57:06 crc kubenswrapper[4763]: I0930 13:57:06.658122 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerName="nova-api-log" containerID="cri-o://21d12cee29d5fcafec5d59a600136caee93d6575f8ab4bafd39e24a24928fe9c" gracePeriod=30 Sep 30 13:57:07 crc kubenswrapper[4763]: I0930 13:57:07.011473 4763 generic.go:334] "Generic (PLEG): container finished" podID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerID="21d12cee29d5fcafec5d59a600136caee93d6575f8ab4bafd39e24a24928fe9c" exitCode=143 Sep 30 13:57:07 crc kubenswrapper[4763]: I0930 13:57:07.011756 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd06976a-fec0-4550-a0b3-aa311a08cbd7","Type":"ContainerDied","Data":"21d12cee29d5fcafec5d59a600136caee93d6575f8ab4bafd39e24a24928fe9c"} Sep 30 13:57:07 crc kubenswrapper[4763]: I0930 13:57:07.015135 4763 generic.go:334] "Generic (PLEG): container finished" podID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerID="4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05" exitCode=0 Sep 30 13:57:07 crc kubenswrapper[4763]: I0930 13:57:07.015203 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fc26096-0b60-4cc8-9d34-a33991f7eae1","Type":"ContainerDied","Data":"4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05"} Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.843510 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.998140 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-log-httpd\") pod \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.998206 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kbhq\" (UniqueName: \"kubernetes.io/projected/2fc26096-0b60-4cc8-9d34-a33991f7eae1-kube-api-access-4kbhq\") pod \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.998253 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-config-data\") pod \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.998276 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-run-httpd\") pod \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.998311 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-scripts\") pod \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.998329 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-combined-ca-bundle\") pod \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.998467 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-ceilometer-tls-certs\") pod \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.998586 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-sg-core-conf-yaml\") pod \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\" (UID: \"2fc26096-0b60-4cc8-9d34-a33991f7eae1\") " Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.998665 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2fc26096-0b60-4cc8-9d34-a33991f7eae1" (UID: "2fc26096-0b60-4cc8-9d34-a33991f7eae1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.998952 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:08 crc kubenswrapper[4763]: I0930 13:57:08.999317 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2fc26096-0b60-4cc8-9d34-a33991f7eae1" (UID: "2fc26096-0b60-4cc8-9d34-a33991f7eae1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.005654 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-scripts" (OuterVolumeSpecName: "scripts") pod "2fc26096-0b60-4cc8-9d34-a33991f7eae1" (UID: "2fc26096-0b60-4cc8-9d34-a33991f7eae1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.007345 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fc26096-0b60-4cc8-9d34-a33991f7eae1-kube-api-access-4kbhq" (OuterVolumeSpecName: "kube-api-access-4kbhq") pod "2fc26096-0b60-4cc8-9d34-a33991f7eae1" (UID: "2fc26096-0b60-4cc8-9d34-a33991f7eae1"). InnerVolumeSpecName "kube-api-access-4kbhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.034012 4763 generic.go:334] "Generic (PLEG): container finished" podID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerID="cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f" exitCode=0 Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.034052 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fc26096-0b60-4cc8-9d34-a33991f7eae1","Type":"ContainerDied","Data":"cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f"} Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.034080 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fc26096-0b60-4cc8-9d34-a33991f7eae1","Type":"ContainerDied","Data":"eabb5f4f6260d51bc0c39c60650d0d21b81c7c23dae4c5aef5b42d84d2d6a9a5"} Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.034097 4763 scope.go:117] "RemoveContainer" containerID="2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.034095 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.038863 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2fc26096-0b60-4cc8-9d34-a33991f7eae1" (UID: "2fc26096-0b60-4cc8-9d34-a33991f7eae1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.068105 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2fc26096-0b60-4cc8-9d34-a33991f7eae1" (UID: "2fc26096-0b60-4cc8-9d34-a33991f7eae1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.107690 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.107965 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kbhq\" (UniqueName: \"kubernetes.io/projected/2fc26096-0b60-4cc8-9d34-a33991f7eae1-kube-api-access-4kbhq\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.108061 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fc26096-0b60-4cc8-9d34-a33991f7eae1-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.108151 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.108235 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.120612 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fc26096-0b60-4cc8-9d34-a33991f7eae1" (UID: "2fc26096-0b60-4cc8-9d34-a33991f7eae1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.148378 4763 scope.go:117] "RemoveContainer" containerID="36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.151555 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-config-data" (OuterVolumeSpecName: "config-data") pod "2fc26096-0b60-4cc8-9d34-a33991f7eae1" (UID: "2fc26096-0b60-4cc8-9d34-a33991f7eae1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.166342 4763 scope.go:117] "RemoveContainer" containerID="cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.187285 4763 scope.go:117] "RemoveContainer" containerID="4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.207113 4763 scope.go:117] "RemoveContainer" containerID="2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.210818 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.211005 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc26096-0b60-4cc8-9d34-a33991f7eae1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:09 crc kubenswrapper[4763]: E0930 13:57:09.211162 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432\": container with ID starting with 2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432 not found: ID does not exist" containerID="2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.211267 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432"} err="failed to get container status \"2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432\": rpc error: code = NotFound desc = could not find container \"2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432\": container with ID starting with 2cdb32cdc36eb9ceeaba9d02c102857664ac14ad4c3fcc82f431109611ba4432 not found: ID does not exist" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.211371 4763 scope.go:117] "RemoveContainer" containerID="36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90" Sep 30 13:57:09 crc kubenswrapper[4763]: E0930 13:57:09.211899 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90\": container with ID starting with 36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90 not found: ID does not exist" containerID="36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.211928 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90"} err="failed to get container status \"36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90\": rpc error: code = NotFound desc = could not find container \"36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90\": container with ID starting with 36ff52ae1744b1e0782558f479bf2f4df94fa9630094947877263026b6a6ca90 not found: ID does not exist" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.211946 4763 scope.go:117] "RemoveContainer" 
containerID="cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f" Sep 30 13:57:09 crc kubenswrapper[4763]: E0930 13:57:09.213282 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f\": container with ID starting with cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f not found: ID does not exist" containerID="cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.213312 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f"} err="failed to get container status \"cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f\": rpc error: code = NotFound desc = could not find container \"cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f\": container with ID starting with cfb2b7c632f87b35a1df92d8602485a609e7a6c28d9adc078ce6dfd597b5d00f not found: ID does not exist" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.213330 4763 scope.go:117] "RemoveContainer" containerID="4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05" Sep 30 13:57:09 crc kubenswrapper[4763]: E0930 13:57:09.213840 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05\": container with ID starting with 4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05 not found: ID does not exist" containerID="4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.213962 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05"} err="failed to get container status \"4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05\": rpc error: code = NotFound desc = could not find container \"4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05\": container with ID starting with 4824737c2eeb0b1d7f031500b6c04790ec65d78a3d761ebc9f7f3809d3c66f05 not found: ID does not exist" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.379767 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.393898 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.405170 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:57:09 crc kubenswrapper[4763]: E0930 13:57:09.405788 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="ceilometer-central-agent" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.405824 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="ceilometer-central-agent" Sep 30 13:57:09 crc kubenswrapper[4763]: E0930 13:57:09.405847 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="proxy-httpd" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.405856 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="proxy-httpd" Sep 30 13:57:09 crc kubenswrapper[4763]: E0930 13:57:09.405872 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="ceilometer-notification-agent" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.405882 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="ceilometer-notification-agent" Sep 30 13:57:09 crc kubenswrapper[4763]: E0930 13:57:09.405914 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="sg-core" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.405921 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="sg-core" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.406171 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="ceilometer-central-agent" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.406192 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="sg-core" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.406213 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="ceilometer-notification-agent" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.406226 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" containerName="proxy-httpd" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.409805 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.413054 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.413332 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.420025 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.420244 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.516113 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.516177 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-scripts\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.516390 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.516531 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-log-httpd\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.516559 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.516716 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-config-data\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.516796 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-run-httpd\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.516910 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5g7d\" (UniqueName: 
\"kubernetes.io/projected/dafb3edf-a4c0-4131-ad09-f836de63ff6b-kube-api-access-v5g7d\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.619800 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-run-httpd\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.619961 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5g7d\" (UniqueName: \"kubernetes.io/projected/dafb3edf-a4c0-4131-ad09-f836de63ff6b-kube-api-access-v5g7d\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.620036 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.620081 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-scripts\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.620216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.620267 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-run-httpd\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.620295 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-log-httpd\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.620362 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.620439 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-config-data\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.620978 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-log-httpd\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.623843 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.624266 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.624495 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.625556 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-scripts\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.633143 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-config-data\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.638096 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5g7d\" (UniqueName: \"kubernetes.io/projected/dafb3edf-a4c0-4131-ad09-f836de63ff6b-kube-api-access-v5g7d\") pod \"ceilometer-0\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " pod="openstack/ceilometer-0" Sep 30 13:57:09 crc kubenswrapper[4763]: I0930 13:57:09.737440 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.054381 4763 generic.go:334] "Generic (PLEG): container finished" podID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerID="8cb90fd9f008eacc105c567131bc8f2f0c0739eead1d38d969f4abcd2591652e" exitCode=0 Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.054909 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd06976a-fec0-4550-a0b3-aa311a08cbd7","Type":"ContainerDied","Data":"8cb90fd9f008eacc105c567131bc8f2f0c0739eead1d38d969f4abcd2591652e"} Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.239652 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.240478 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.332717 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-combined-ca-bundle\") pod \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.332785 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6dkt\" (UniqueName: \"kubernetes.io/projected/bd06976a-fec0-4550-a0b3-aa311a08cbd7-kube-api-access-w6dkt\") pod \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.332855 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-config-data\") pod \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.332974 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd06976a-fec0-4550-a0b3-aa311a08cbd7-logs\") pod \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\" (UID: \"bd06976a-fec0-4550-a0b3-aa311a08cbd7\") " Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.333769 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd06976a-fec0-4550-a0b3-aa311a08cbd7-logs" (OuterVolumeSpecName: "logs") pod "bd06976a-fec0-4550-a0b3-aa311a08cbd7" (UID: "bd06976a-fec0-4550-a0b3-aa311a08cbd7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.338826 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd06976a-fec0-4550-a0b3-aa311a08cbd7-kube-api-access-w6dkt" (OuterVolumeSpecName: "kube-api-access-w6dkt") pod "bd06976a-fec0-4550-a0b3-aa311a08cbd7" (UID: "bd06976a-fec0-4550-a0b3-aa311a08cbd7"). InnerVolumeSpecName "kube-api-access-w6dkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.376771 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd06976a-fec0-4550-a0b3-aa311a08cbd7" (UID: "bd06976a-fec0-4550-a0b3-aa311a08cbd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.390072 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-config-data" (OuterVolumeSpecName: "config-data") pod "bd06976a-fec0-4550-a0b3-aa311a08cbd7" (UID: "bd06976a-fec0-4550-a0b3-aa311a08cbd7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.434755 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6dkt\" (UniqueName: \"kubernetes.io/projected/bd06976a-fec0-4550-a0b3-aa311a08cbd7-kube-api-access-w6dkt\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.434785 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.434795 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd06976a-fec0-4550-a0b3-aa311a08cbd7-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.434806 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd06976a-fec0-4550-a0b3-aa311a08cbd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:10 crc kubenswrapper[4763]: I0930 13:57:10.504196 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fc26096-0b60-4cc8-9d34-a33991f7eae1" path="/var/lib/kubelet/pods/2fc26096-0b60-4cc8-9d34-a33991f7eae1/volumes" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.066428 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dafb3edf-a4c0-4131-ad09-f836de63ff6b","Type":"ContainerStarted","Data":"206bc2e365fe97699afc965b9cd36f62f6b07203198419a0dace4fa2fc38a5cf"} Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.068698 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd06976a-fec0-4550-a0b3-aa311a08cbd7","Type":"ContainerDied","Data":"5e8177658b46343f38f4aed92f8e6ed17efb2674950554d73b596e4c7dd9d0ba"} Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.068797 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.068866 4763 scope.go:117] "RemoveContainer" containerID="8cb90fd9f008eacc105c567131bc8f2f0c0739eead1d38d969f4abcd2591652e" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.092007 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.092993 4763 scope.go:117] "RemoveContainer" containerID="21d12cee29d5fcafec5d59a600136caee93d6575f8ab4bafd39e24a24928fe9c" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.099633 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.115264 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 13:57:11 crc kubenswrapper[4763]: E0930 13:57:11.115661 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerName="nova-api-api" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.115678 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerName="nova-api-api" Sep 30 13:57:11 crc kubenswrapper[4763]: E0930 13:57:11.115688 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerName="nova-api-log" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.115695 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerName="nova-api-log" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.115891 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerName="nova-api-api" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.115912 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" containerName="nova-api-log" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.116880 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.119079 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.119134 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.124243 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.147142 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-config-data\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.147206 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.147249 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-public-tls-certs\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.147316 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.147354 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c517f7af-3e33-4453-b727-1a1c458828d0-logs\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.147405 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvrvr\" (UniqueName: \"kubernetes.io/projected/c517f7af-3e33-4453-b727-1a1c458828d0-kube-api-access-vvrvr\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.163259 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.249077 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-config-data\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.249146 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.249190 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-public-tls-certs\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.249220 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.249258 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c517f7af-3e33-4453-b727-1a1c458828d0-logs\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.249310 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvrvr\" (UniqueName: \"kubernetes.io/projected/c517f7af-3e33-4453-b727-1a1c458828d0-kube-api-access-vvrvr\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.251161 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c517f7af-3e33-4453-b727-1a1c458828d0-logs\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.255881 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.256056 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-config-data\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.256788 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.270454 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-public-tls-certs\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.278210 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvrvr\" (UniqueName: \"kubernetes.io/projected/c517f7af-3e33-4453-b727-1a1c458828d0-kube-api-access-vvrvr\") pod \"nova-api-0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " pod="openstack/nova-api-0" Sep 
30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.293245 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.315395 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.346853 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:57:11 crc kubenswrapper[4763]: I0930 13:57:11.845053 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:57:11 crc kubenswrapper[4763]: W0930 13:57:11.851066 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc517f7af_3e33_4453_b727_1a1c458828d0.slice/crio-d3a77b7c83674a5c7069dcad1ba0f9008ca4905004a8f32015b913e71266d3f8 WatchSource:0}: Error finding container d3a77b7c83674a5c7069dcad1ba0f9008ca4905004a8f32015b913e71266d3f8: Status 404 returned error can't find the container with id d3a77b7c83674a5c7069dcad1ba0f9008ca4905004a8f32015b913e71266d3f8 Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.088107 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dafb3edf-a4c0-4131-ad09-f836de63ff6b","Type":"ContainerStarted","Data":"4050fb5fd9697e750ea813cde28a4d185f69fe05aa260d270466a73fd43cd815"} Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.089556 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c517f7af-3e33-4453-b727-1a1c458828d0","Type":"ContainerStarted","Data":"d3a77b7c83674a5c7069dcad1ba0f9008ca4905004a8f32015b913e71266d3f8"} Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.106756 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.316306 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ftn4t"] Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.318364 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.320923 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.324043 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ftn4t"] Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.325792 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.382441 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-scripts\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.382502 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4ph9\" (UniqueName: \"kubernetes.io/projected/074a20c5-bc97-4a4b-9f11-60c63250120a-kube-api-access-j4ph9\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.382549 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-config-data\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.382778 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.489674 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ph9\" (UniqueName: \"kubernetes.io/projected/074a20c5-bc97-4a4b-9f11-60c63250120a-kube-api-access-j4ph9\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.490515 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-config-data\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.491227 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.491443 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-scripts\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.495510 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-scripts\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.495570 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.496356 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-config-data\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.502415 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd06976a-fec0-4550-a0b3-aa311a08cbd7" path="/var/lib/kubelet/pods/bd06976a-fec0-4550-a0b3-aa311a08cbd7/volumes" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.507277 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4ph9\" (UniqueName: \"kubernetes.io/projected/074a20c5-bc97-4a4b-9f11-60c63250120a-kube-api-access-j4ph9\") pod \"nova-cell1-cell-mapping-ftn4t\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:12 crc kubenswrapper[4763]: I0930 13:57:12.643970 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:13 crc kubenswrapper[4763]: I0930 13:57:13.152426 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c517f7af-3e33-4453-b727-1a1c458828d0","Type":"ContainerStarted","Data":"78a30fa0cb886d8732c7e08de0fab1cf5a4d537f41b538998049c0d6511e87fe"} Sep 30 13:57:13 crc kubenswrapper[4763]: I0930 13:57:13.210014 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ftn4t"] Sep 30 13:57:13 crc kubenswrapper[4763]: I0930 13:57:13.776770 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:57:13 crc kubenswrapper[4763]: I0930 13:57:13.830224 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-4hcqr"] Sep 30 13:57:13 crc kubenswrapper[4763]: I0930 13:57:13.830471 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" podUID="db68026f-4475-4db7-a942-3ba486623fc5" containerName="dnsmasq-dns" containerID="cri-o://d6a9a0e305070b8965c1c9f3814820c8d0564d3560a0e9623089e72d04b7cb8d" gracePeriod=10 Sep 30 13:57:14 crc kubenswrapper[4763]: I0930 13:57:14.138278 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" podUID="db68026f-4475-4db7-a942-3ba486623fc5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.188:5353: connect: connection refused" Sep 30 13:57:14 crc kubenswrapper[4763]: I0930 13:57:14.176911 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dafb3edf-a4c0-4131-ad09-f836de63ff6b","Type":"ContainerStarted","Data":"9f9834fea4bcdd3ebd6df5293d9dbc06c9286884eddc735ba525f57b0060ea83"} Sep 30 13:57:14 crc kubenswrapper[4763]: I0930 13:57:14.180649 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c517f7af-3e33-4453-b727-1a1c458828d0","Type":"ContainerStarted","Data":"56b368c578edab190400d0294d0111cac90771b3cc7d82995d2aa31c46c9525c"} Sep 30 13:57:14 crc kubenswrapper[4763]: I0930 13:57:14.185933 4763 generic.go:334] "Generic (PLEG): container finished" podID="db68026f-4475-4db7-a942-3ba486623fc5" containerID="d6a9a0e305070b8965c1c9f3814820c8d0564d3560a0e9623089e72d04b7cb8d" exitCode=0 Sep 30 13:57:14 crc kubenswrapper[4763]: I0930 13:57:14.186058 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" event={"ID":"db68026f-4475-4db7-a942-3ba486623fc5","Type":"ContainerDied","Data":"d6a9a0e305070b8965c1c9f3814820c8d0564d3560a0e9623089e72d04b7cb8d"} Sep 30 13:57:14 crc kubenswrapper[4763]: I0930 13:57:14.193261 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ftn4t" event={"ID":"074a20c5-bc97-4a4b-9f11-60c63250120a","Type":"ContainerStarted","Data":"544efce59b72529479a217ea6778472f68eff77d359bf446afeaeaba8a02a141"} Sep 30 13:57:14 crc kubenswrapper[4763]: I0930 13:57:14.193307 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ftn4t" event={"ID":"074a20c5-bc97-4a4b-9f11-60c63250120a","Type":"ContainerStarted","Data":"d561dccab5daa916cd4786ac205774d11f79c6fbdf2b993f7abf665bd35598b5"} Sep 30 13:57:14 crc kubenswrapper[4763]: I0930 13:57:14.206883 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.206857485 
podStartE2EDuration="3.206857485s" podCreationTimestamp="2025-09-30 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:14.202136447 +0000 UTC m=+1306.340696742" watchObservedRunningTime="2025-09-30 13:57:14.206857485 +0000 UTC m=+1306.345417780" Sep 30 13:57:14 crc kubenswrapper[4763]: I0930 13:57:14.228160 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-ftn4t" podStartSLOduration=2.228142349 podStartE2EDuration="2.228142349s" podCreationTimestamp="2025-09-30 13:57:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:14.225037771 +0000 UTC m=+1306.363598056" watchObservedRunningTime="2025-09-30 13:57:14.228142349 +0000 UTC m=+1306.366702624" Sep 30 13:57:14 crc kubenswrapper[4763]: I0930 13:57:14.972382 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.077726 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pjf2\" (UniqueName: \"kubernetes.io/projected/db68026f-4475-4db7-a942-3ba486623fc5-kube-api-access-2pjf2\") pod \"db68026f-4475-4db7-a942-3ba486623fc5\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.078129 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-config\") pod \"db68026f-4475-4db7-a942-3ba486623fc5\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.078244 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-nb\") pod \"db68026f-4475-4db7-a942-3ba486623fc5\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.078259 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-svc\") pod \"db68026f-4475-4db7-a942-3ba486623fc5\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.078304 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-swift-storage-0\") pod \"db68026f-4475-4db7-a942-3ba486623fc5\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.078324 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-sb\") pod \"db68026f-4475-4db7-a942-3ba486623fc5\" (UID: \"db68026f-4475-4db7-a942-3ba486623fc5\") " Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.082434 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db68026f-4475-4db7-a942-3ba486623fc5-kube-api-access-2pjf2" (OuterVolumeSpecName: "kube-api-access-2pjf2") pod "db68026f-4475-4db7-a942-3ba486623fc5" 
(UID: "db68026f-4475-4db7-a942-3ba486623fc5"). InnerVolumeSpecName "kube-api-access-2pjf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.137163 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db68026f-4475-4db7-a942-3ba486623fc5" (UID: "db68026f-4475-4db7-a942-3ba486623fc5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.142243 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "db68026f-4475-4db7-a942-3ba486623fc5" (UID: "db68026f-4475-4db7-a942-3ba486623fc5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.146654 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "db68026f-4475-4db7-a942-3ba486623fc5" (UID: "db68026f-4475-4db7-a942-3ba486623fc5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.155451 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-config" (OuterVolumeSpecName: "config") pod "db68026f-4475-4db7-a942-3ba486623fc5" (UID: "db68026f-4475-4db7-a942-3ba486623fc5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.156807 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "db68026f-4475-4db7-a942-3ba486623fc5" (UID: "db68026f-4475-4db7-a942-3ba486623fc5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.183423 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pjf2\" (UniqueName: \"kubernetes.io/projected/db68026f-4475-4db7-a942-3ba486623fc5-kube-api-access-2pjf2\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.183463 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.183477 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.183486 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.183496 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.183504 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db68026f-4475-4db7-a942-3ba486623fc5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.210342 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" event={"ID":"db68026f-4475-4db7-a942-3ba486623fc5","Type":"ContainerDied","Data":"43382424ab8195f91b66c3713af9159c5287bdaac738a2651285cdc26db0a5d8"} Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.210429 4763 scope.go:117] "RemoveContainer" containerID="d6a9a0e305070b8965c1c9f3814820c8d0564d3560a0e9623089e72d04b7cb8d" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.211055 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d9cc4c77f-4hcqr" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.236188 4763 scope.go:117] "RemoveContainer" containerID="e2fe3c664a0371aca61ca711621315d51b62700f060d37a7874df9f529fec7e9" Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.280773 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-4hcqr"] Sep 30 13:57:15 crc kubenswrapper[4763]: I0930 13:57:15.286326 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-4hcqr"] Sep 30 13:57:16 crc kubenswrapper[4763]: I0930 13:57:16.222517 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dafb3edf-a4c0-4131-ad09-f836de63ff6b","Type":"ContainerStarted","Data":"e4e037d3269bde82dabfcdad3ed6cb4f0b5a8c6d24b989ac4a2b7ef124409e4b"} Sep 30 13:57:16 crc kubenswrapper[4763]: I0930 13:57:16.506677 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db68026f-4475-4db7-a942-3ba486623fc5" path="/var/lib/kubelet/pods/db68026f-4475-4db7-a942-3ba486623fc5/volumes" Sep 30 13:57:18 crc kubenswrapper[4763]: I0930 13:57:18.245297 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dafb3edf-a4c0-4131-ad09-f836de63ff6b","Type":"ContainerStarted","Data":"853ee2a8bbade3ee4f3a22fb7a290fda5a987ec05e7c37f8d30f4d981573d587"} Sep 30 13:57:18 crc kubenswrapper[4763]: I0930 13:57:18.246040 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 13:57:18 crc kubenswrapper[4763]: I0930 13:57:18.272018 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.332504234 podStartE2EDuration="9.272000911s" podCreationTimestamp="2025-09-30 13:57:09 +0000 UTC" firstStartedPulling="2025-09-30 13:57:10.253261155 +0000 UTC m=+1302.391821440" lastFinishedPulling="2025-09-30 13:57:17.192757812 +0000 UTC m=+1309.331318117" observedRunningTime="2025-09-30 13:57:18.266849482 +0000 UTC m=+1310.405409787" watchObservedRunningTime="2025-09-30 13:57:18.272000911 +0000 UTC m=+1310.410561196" Sep 30 13:57:19 crc kubenswrapper[4763]: I0930 13:57:19.256527 4763 generic.go:334] "Generic (PLEG): container finished" podID="074a20c5-bc97-4a4b-9f11-60c63250120a" containerID="544efce59b72529479a217ea6778472f68eff77d359bf446afeaeaba8a02a141" exitCode=0 Sep 30 13:57:19 crc kubenswrapper[4763]: I0930 13:57:19.256631 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ftn4t" event={"ID":"074a20c5-bc97-4a4b-9f11-60c63250120a","Type":"ContainerDied","Data":"544efce59b72529479a217ea6778472f68eff77d359bf446afeaeaba8a02a141"} Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.594728 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.687833 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-combined-ca-bundle\") pod \"074a20c5-bc97-4a4b-9f11-60c63250120a\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.688113 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-config-data\") pod \"074a20c5-bc97-4a4b-9f11-60c63250120a\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.688256 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4ph9\" (UniqueName: \"kubernetes.io/projected/074a20c5-bc97-4a4b-9f11-60c63250120a-kube-api-access-j4ph9\") pod \"074a20c5-bc97-4a4b-9f11-60c63250120a\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.688482 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-scripts\") pod \"074a20c5-bc97-4a4b-9f11-60c63250120a\" (UID: \"074a20c5-bc97-4a4b-9f11-60c63250120a\") " Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.693205 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-scripts" (OuterVolumeSpecName: "scripts") pod "074a20c5-bc97-4a4b-9f11-60c63250120a" (UID: "074a20c5-bc97-4a4b-9f11-60c63250120a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.698854 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/074a20c5-bc97-4a4b-9f11-60c63250120a-kube-api-access-j4ph9" (OuterVolumeSpecName: "kube-api-access-j4ph9") pod "074a20c5-bc97-4a4b-9f11-60c63250120a" (UID: "074a20c5-bc97-4a4b-9f11-60c63250120a"). InnerVolumeSpecName "kube-api-access-j4ph9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.720955 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-config-data" (OuterVolumeSpecName: "config-data") pod "074a20c5-bc97-4a4b-9f11-60c63250120a" (UID: "074a20c5-bc97-4a4b-9f11-60c63250120a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.723590 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "074a20c5-bc97-4a4b-9f11-60c63250120a" (UID: "074a20c5-bc97-4a4b-9f11-60c63250120a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.791263 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.791298 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.791310 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/074a20c5-bc97-4a4b-9f11-60c63250120a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:20 crc kubenswrapper[4763]: I0930 13:57:20.791325 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4ph9\" (UniqueName: \"kubernetes.io/projected/074a20c5-bc97-4a4b-9f11-60c63250120a-kube-api-access-j4ph9\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:21 crc kubenswrapper[4763]: I0930 13:57:21.279721 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ftn4t" event={"ID":"074a20c5-bc97-4a4b-9f11-60c63250120a","Type":"ContainerDied","Data":"d561dccab5daa916cd4786ac205774d11f79c6fbdf2b993f7abf665bd35598b5"} Sep 30 13:57:21 crc kubenswrapper[4763]: I0930 13:57:21.279774 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d561dccab5daa916cd4786ac205774d11f79c6fbdf2b993f7abf665bd35598b5" Sep 30 13:57:21 crc kubenswrapper[4763]: I0930 13:57:21.279790 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ftn4t" Sep 30 13:57:21 crc kubenswrapper[4763]: I0930 13:57:21.348328 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:57:21 crc kubenswrapper[4763]: I0930 13:57:21.348585 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:57:21 crc kubenswrapper[4763]: I0930 13:57:21.458217 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:57:21 crc kubenswrapper[4763]: I0930 13:57:21.480342 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:57:21 crc kubenswrapper[4763]: I0930 13:57:21.480614 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="12ee2a46-828b-4494-a721-5c8e3e6c4fa3" containerName="nova-scheduler-scheduler" containerID="cri-o://4a281cb7a3c3e7a37caa4b764b005d08332aabdc751dd41fbd02bcd2966542ce" gracePeriod=30 Sep 30 13:57:21 crc kubenswrapper[4763]: I0930 13:57:21.513904 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:57:21 crc kubenswrapper[4763]: I0930 13:57:21.514261 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerName="nova-metadata-log" containerID="cri-o://6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505" gracePeriod=30 Sep 30 13:57:21 crc kubenswrapper[4763]: I0930 13:57:21.514329 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" 
containerName="nova-metadata-metadata" containerID="cri-o://3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248" gracePeriod=30 Sep 30 13:57:22 crc kubenswrapper[4763]: E0930 13:57:22.249703 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a281cb7a3c3e7a37caa4b764b005d08332aabdc751dd41fbd02bcd2966542ce" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:57:22 crc kubenswrapper[4763]: E0930 13:57:22.251363 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a281cb7a3c3e7a37caa4b764b005d08332aabdc751dd41fbd02bcd2966542ce" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:57:22 crc kubenswrapper[4763]: E0930 13:57:22.252653 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a281cb7a3c3e7a37caa4b764b005d08332aabdc751dd41fbd02bcd2966542ce" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:57:22 crc kubenswrapper[4763]: E0930 13:57:22.252691 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="12ee2a46-828b-4494-a721-5c8e3e6c4fa3" containerName="nova-scheduler-scheduler" Sep 30 13:57:22 crc kubenswrapper[4763]: I0930 13:57:22.291795 4763 generic.go:334] "Generic (PLEG): container finished" podID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerID="6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505" exitCode=143 Sep 30 13:57:22 crc kubenswrapper[4763]: I0930 13:57:22.291891 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32b6e2c8-d14f-4f03-b830-d9ef617b81f9","Type":"ContainerDied","Data":"6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505"} Sep 30 13:57:22 crc kubenswrapper[4763]: I0930 13:57:22.360822 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c517f7af-3e33-4453-b727-1a1c458828d0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:57:22 crc kubenswrapper[4763]: I0930 13:57:22.360822 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c517f7af-3e33-4453-b727-1a1c458828d0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:57:23 crc kubenswrapper[4763]: I0930 13:57:23.301078 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c517f7af-3e33-4453-b727-1a1c458828d0" containerName="nova-api-log" containerID="cri-o://78a30fa0cb886d8732c7e08de0fab1cf5a4d537f41b538998049c0d6511e87fe" gracePeriod=30 Sep 30 13:57:23 crc kubenswrapper[4763]: I0930 13:57:23.301237 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c517f7af-3e33-4453-b727-1a1c458828d0" 
containerName="nova-api-api" containerID="cri-o://56b368c578edab190400d0294d0111cac90771b3cc7d82995d2aa31c46c9525c" gracePeriod=30 Sep 30 13:57:24 crc kubenswrapper[4763]: I0930 13:57:24.317990 4763 generic.go:334] "Generic (PLEG): container finished" podID="c517f7af-3e33-4453-b727-1a1c458828d0" containerID="78a30fa0cb886d8732c7e08de0fab1cf5a4d537f41b538998049c0d6511e87fe" exitCode=143 Sep 30 13:57:24 crc kubenswrapper[4763]: I0930 13:57:24.318041 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c517f7af-3e33-4453-b727-1a1c458828d0","Type":"ContainerDied","Data":"78a30fa0cb886d8732c7e08de0fab1cf5a4d537f41b538998049c0d6511e87fe"} Sep 30 13:57:24 crc kubenswrapper[4763]: I0930 13:57:24.658794 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:35494->10.217.0.193:8775: read: connection reset by peer" Sep 30 13:57:24 crc kubenswrapper[4763]: I0930 13:57:24.658874 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:35504->10.217.0.193:8775: read: connection reset by peer" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.215357 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.334178 4763 generic.go:334] "Generic (PLEG): container finished" podID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerID="3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248" exitCode=0 Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.334224 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32b6e2c8-d14f-4f03-b830-d9ef617b81f9","Type":"ContainerDied","Data":"3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248"} Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.334232 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.334308 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32b6e2c8-d14f-4f03-b830-d9ef617b81f9","Type":"ContainerDied","Data":"af228d55e8797fc37b8e9e232ec7e7e18528d183e913dfd1bce70fb4b4818a1e"} Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.334349 4763 scope.go:117] "RemoveContainer" containerID="3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.368344 4763 scope.go:117] "RemoveContainer" containerID="6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.376313 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-combined-ca-bundle\") pod \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.376358 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-nova-metadata-tls-certs\") pod \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.376393 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-logs\") pod \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.376413 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhgtl\" (UniqueName: \"kubernetes.io/projected/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-kube-api-access-nhgtl\") pod \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.376474 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-config-data\") pod \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\" (UID: \"32b6e2c8-d14f-4f03-b830-d9ef617b81f9\") " Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.376927 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-logs" (OuterVolumeSpecName: "logs") pod "32b6e2c8-d14f-4f03-b830-d9ef617b81f9" (UID: "32b6e2c8-d14f-4f03-b830-d9ef617b81f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.393927 4763 scope.go:117] "RemoveContainer" containerID="3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.393947 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-kube-api-access-nhgtl" (OuterVolumeSpecName: "kube-api-access-nhgtl") pod "32b6e2c8-d14f-4f03-b830-d9ef617b81f9" (UID: "32b6e2c8-d14f-4f03-b830-d9ef617b81f9"). InnerVolumeSpecName "kube-api-access-nhgtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:25 crc kubenswrapper[4763]: E0930 13:57:25.398283 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248\": container with ID starting with 3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248 not found: ID does not exist" containerID="3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.398344 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248"} err="failed to get container status \"3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248\": rpc error: code = NotFound desc = could not find container \"3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248\": container with ID starting with 3cbd12d4957879f2e00ffef94fcdeaba712f5f09628c488463e4d5bd51685248 not found: ID does not exist" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.398376 4763 scope.go:117] "RemoveContainer" containerID="6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505" Sep 30 13:57:25 crc kubenswrapper[4763]: E0930 13:57:25.399172 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505\": container with ID starting with 6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505 not found: ID does not exist" containerID="6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.399313 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505"} err="failed to get container status \"6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505\": rpc error: code = NotFound desc = could not find container \"6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505\": container with ID starting with 6e1b9bfe03013649ec88f75c096435b4d20e2a3fca4d603f35b2f4b873357505 not found: ID does not exist" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.412876 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32b6e2c8-d14f-4f03-b830-d9ef617b81f9" (UID: "32b6e2c8-d14f-4f03-b830-d9ef617b81f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.415711 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-config-data" (OuterVolumeSpecName: "config-data") pod "32b6e2c8-d14f-4f03-b830-d9ef617b81f9" (UID: "32b6e2c8-d14f-4f03-b830-d9ef617b81f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.433185 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "32b6e2c8-d14f-4f03-b830-d9ef617b81f9" (UID: "32b6e2c8-d14f-4f03-b830-d9ef617b81f9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.478358 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.478391 4763 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.478404 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.478413 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhgtl\" (UniqueName: \"kubernetes.io/projected/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-kube-api-access-nhgtl\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.478422 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b6e2c8-d14f-4f03-b830-d9ef617b81f9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.671016 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.681206 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.689041 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:57:25 crc kubenswrapper[4763]: E0930 13:57:25.689421 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerName="nova-metadata-metadata" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.689437 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerName="nova-metadata-metadata" Sep 30 13:57:25 crc kubenswrapper[4763]: E0930 13:57:25.689456 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db68026f-4475-4db7-a942-3ba486623fc5" containerName="dnsmasq-dns" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.689463 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="db68026f-4475-4db7-a942-3ba486623fc5" containerName="dnsmasq-dns" Sep 30 13:57:25 crc kubenswrapper[4763]: E0930 13:57:25.689487 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db68026f-4475-4db7-a942-3ba486623fc5" containerName="init" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.689494 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="db68026f-4475-4db7-a942-3ba486623fc5" containerName="init" Sep 30 13:57:25 crc kubenswrapper[4763]: E0930 13:57:25.689511 4763 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerName="nova-metadata-log" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.689517 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerName="nova-metadata-log" Sep 30 13:57:25 crc kubenswrapper[4763]: E0930 13:57:25.689529 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074a20c5-bc97-4a4b-9f11-60c63250120a" containerName="nova-manage" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.689535 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="074a20c5-bc97-4a4b-9f11-60c63250120a" containerName="nova-manage" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.689762 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerName="nova-metadata-metadata" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.689786 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="074a20c5-bc97-4a4b-9f11-60c63250120a" containerName="nova-manage" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.689805 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="db68026f-4475-4db7-a942-3ba486623fc5" containerName="dnsmasq-dns" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.689813 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" containerName="nova-metadata-log" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.690723 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.694900 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.695099 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.704982 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.784097 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-config-data\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.784142 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.784282 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.784302 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2dnr\" 
(UniqueName: \"kubernetes.io/projected/709e8d49-783d-44fb-8bcb-0b4ac2199efe-kube-api-access-c2dnr\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.784321 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/709e8d49-783d-44fb-8bcb-0b4ac2199efe-logs\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.886090 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.886466 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-config-data\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.886570 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.886592 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/709e8d49-783d-44fb-8bcb-0b4ac2199efe-logs\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.886631 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2dnr\" (UniqueName: \"kubernetes.io/projected/709e8d49-783d-44fb-8bcb-0b4ac2199efe-kube-api-access-c2dnr\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.887231 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/709e8d49-783d-44fb-8bcb-0b4ac2199efe-logs\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.892180 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.892188 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.892267 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-config-data\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:25 crc kubenswrapper[4763]: I0930 13:57:25.908450 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2dnr\" (UniqueName: \"kubernetes.io/projected/709e8d49-783d-44fb-8bcb-0b4ac2199efe-kube-api-access-c2dnr\") pod \"nova-metadata-0\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " pod="openstack/nova-metadata-0" Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.007269 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.345907 4763 generic.go:334] "Generic (PLEG): container finished" podID="12ee2a46-828b-4494-a721-5c8e3e6c4fa3" containerID="4a281cb7a3c3e7a37caa4b764b005d08332aabdc751dd41fbd02bcd2966542ce" exitCode=0 Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.345983 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12ee2a46-828b-4494-a721-5c8e3e6c4fa3","Type":"ContainerDied","Data":"4a281cb7a3c3e7a37caa4b764b005d08332aabdc751dd41fbd02bcd2966542ce"} Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.460210 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.500659 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b6e2c8-d14f-4f03-b830-d9ef617b81f9" path="/var/lib/kubelet/pods/32b6e2c8-d14f-4f03-b830-d9ef617b81f9/volumes" Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.589471 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.703464 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-combined-ca-bundle\") pod \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.703635 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-config-data\") pod \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.704483 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4hm4\" (UniqueName: \"kubernetes.io/projected/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-kube-api-access-f4hm4\") pod \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\" (UID: \"12ee2a46-828b-4494-a721-5c8e3e6c4fa3\") " Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.711004 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-kube-api-access-f4hm4" (OuterVolumeSpecName: "kube-api-access-f4hm4") pod "12ee2a46-828b-4494-a721-5c8e3e6c4fa3" (UID: "12ee2a46-828b-4494-a721-5c8e3e6c4fa3"). InnerVolumeSpecName "kube-api-access-f4hm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.732450 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-config-data" (OuterVolumeSpecName: "config-data") pod "12ee2a46-828b-4494-a721-5c8e3e6c4fa3" (UID: "12ee2a46-828b-4494-a721-5c8e3e6c4fa3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.733411 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12ee2a46-828b-4494-a721-5c8e3e6c4fa3" (UID: "12ee2a46-828b-4494-a721-5c8e3e6c4fa3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.807697 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.808484 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4hm4\" (UniqueName: \"kubernetes.io/projected/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-kube-api-access-f4hm4\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:26 crc kubenswrapper[4763]: I0930 13:57:26.808571 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ee2a46-828b-4494-a721-5c8e3e6c4fa3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.364068 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"709e8d49-783d-44fb-8bcb-0b4ac2199efe","Type":"ContainerStarted","Data":"6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152"} Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.364429 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"709e8d49-783d-44fb-8bcb-0b4ac2199efe","Type":"ContainerStarted","Data":"80c2142c89d30f005e287aea73691de022eb648e9176905b24ee5c3927298e85"} Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.366393 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12ee2a46-828b-4494-a721-5c8e3e6c4fa3","Type":"ContainerDied","Data":"e170c7f1eb630c6eef251bbd94dfa7f20020848bff35157a21d239b41295d58d"} Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.366426 4763 scope.go:117] "RemoveContainer" containerID="4a281cb7a3c3e7a37caa4b764b005d08332aabdc751dd41fbd02bcd2966542ce" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.366511 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.412483 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.427569 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.444069 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:57:27 crc kubenswrapper[4763]: E0930 13:57:27.444458 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ee2a46-828b-4494-a721-5c8e3e6c4fa3" containerName="nova-scheduler-scheduler" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.444472 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ee2a46-828b-4494-a721-5c8e3e6c4fa3" containerName="nova-scheduler-scheduler" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.444653 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ee2a46-828b-4494-a721-5c8e3e6c4fa3" containerName="nova-scheduler-scheduler" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.445238 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.447106 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.453814 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.524002 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjwxk\" (UniqueName: \"kubernetes.io/projected/c4737b72-6133-4316-8b4e-1a7a3938cd05-kube-api-access-jjwxk\") pod \"nova-scheduler-0\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " pod="openstack/nova-scheduler-0" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.524262 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " pod="openstack/nova-scheduler-0" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.524431 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-config-data\") pod \"nova-scheduler-0\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " pod="openstack/nova-scheduler-0" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.626932 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjwxk\" (UniqueName: \"kubernetes.io/projected/c4737b72-6133-4316-8b4e-1a7a3938cd05-kube-api-access-jjwxk\") pod \"nova-scheduler-0\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " pod="openstack/nova-scheduler-0" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.627045 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " pod="openstack/nova-scheduler-0" Sep 30 
13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.627155 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-config-data\") pod \"nova-scheduler-0\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " pod="openstack/nova-scheduler-0" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.651852 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " pod="openstack/nova-scheduler-0" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.651943 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-config-data\") pod \"nova-scheduler-0\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " pod="openstack/nova-scheduler-0" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.657480 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjwxk\" (UniqueName: \"kubernetes.io/projected/c4737b72-6133-4316-8b4e-1a7a3938cd05-kube-api-access-jjwxk\") pod \"nova-scheduler-0\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " pod="openstack/nova-scheduler-0" Sep 30 13:57:27 crc kubenswrapper[4763]: I0930 13:57:27.798795 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.257262 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:57:28 crc kubenswrapper[4763]: W0930 13:57:28.261889 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4737b72_6133_4316_8b4e_1a7a3938cd05.slice/crio-7191da17e42ff72a38a1c8d86c161c95ea806ef10e01af229fefb6d267c6ebef WatchSource:0}: Error finding container 7191da17e42ff72a38a1c8d86c161c95ea806ef10e01af229fefb6d267c6ebef: Status 404 returned error can't find the container with id 7191da17e42ff72a38a1c8d86c161c95ea806ef10e01af229fefb6d267c6ebef Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.379587 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c4737b72-6133-4316-8b4e-1a7a3938cd05","Type":"ContainerStarted","Data":"7191da17e42ff72a38a1c8d86c161c95ea806ef10e01af229fefb6d267c6ebef"} Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.385541 4763 generic.go:334] "Generic (PLEG): container finished" podID="c517f7af-3e33-4453-b727-1a1c458828d0" containerID="56b368c578edab190400d0294d0111cac90771b3cc7d82995d2aa31c46c9525c" exitCode=0 Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.385623 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c517f7af-3e33-4453-b727-1a1c458828d0","Type":"ContainerDied","Data":"56b368c578edab190400d0294d0111cac90771b3cc7d82995d2aa31c46c9525c"} Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.410248 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"709e8d49-783d-44fb-8bcb-0b4ac2199efe","Type":"ContainerStarted","Data":"9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5"} Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.442930 4763 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.442913115 podStartE2EDuration="3.442913115s" podCreationTimestamp="2025-09-30 13:57:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:28.430998847 +0000 UTC m=+1320.569559132" watchObservedRunningTime="2025-09-30 13:57:28.442913115 +0000 UTC m=+1320.581473400" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.501663 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ee2a46-828b-4494-a721-5c8e3e6c4fa3" path="/var/lib/kubelet/pods/12ee2a46-828b-4494-a721-5c8e3e6c4fa3/volumes" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.561218 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.651758 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c517f7af-3e33-4453-b727-1a1c458828d0-logs\") pod \"c517f7af-3e33-4453-b727-1a1c458828d0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.651836 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-public-tls-certs\") pod \"c517f7af-3e33-4453-b727-1a1c458828d0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.651920 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-combined-ca-bundle\") pod \"c517f7af-3e33-4453-b727-1a1c458828d0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.652053 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-internal-tls-certs\") pod \"c517f7af-3e33-4453-b727-1a1c458828d0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.652185 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvrvr\" (UniqueName: \"kubernetes.io/projected/c517f7af-3e33-4453-b727-1a1c458828d0-kube-api-access-vvrvr\") pod \"c517f7af-3e33-4453-b727-1a1c458828d0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.652180 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c517f7af-3e33-4453-b727-1a1c458828d0-logs" (OuterVolumeSpecName: "logs") pod "c517f7af-3e33-4453-b727-1a1c458828d0" (UID: "c517f7af-3e33-4453-b727-1a1c458828d0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.652336 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-config-data\") pod \"c517f7af-3e33-4453-b727-1a1c458828d0\" (UID: \"c517f7af-3e33-4453-b727-1a1c458828d0\") " Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.653208 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c517f7af-3e33-4453-b727-1a1c458828d0-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.657894 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c517f7af-3e33-4453-b727-1a1c458828d0-kube-api-access-vvrvr" (OuterVolumeSpecName: "kube-api-access-vvrvr") pod "c517f7af-3e33-4453-b727-1a1c458828d0" (UID: "c517f7af-3e33-4453-b727-1a1c458828d0"). InnerVolumeSpecName "kube-api-access-vvrvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.680378 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-config-data" (OuterVolumeSpecName: "config-data") pod "c517f7af-3e33-4453-b727-1a1c458828d0" (UID: "c517f7af-3e33-4453-b727-1a1c458828d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.684928 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c517f7af-3e33-4453-b727-1a1c458828d0" (UID: "c517f7af-3e33-4453-b727-1a1c458828d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.705901 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c517f7af-3e33-4453-b727-1a1c458828d0" (UID: "c517f7af-3e33-4453-b727-1a1c458828d0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.707783 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c517f7af-3e33-4453-b727-1a1c458828d0" (UID: "c517f7af-3e33-4453-b727-1a1c458828d0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.755163 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.755203 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.755212 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.755224 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvrvr\" (UniqueName: \"kubernetes.io/projected/c517f7af-3e33-4453-b727-1a1c458828d0-kube-api-access-vvrvr\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:28 crc kubenswrapper[4763]: I0930 13:57:28.755238 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c517f7af-3e33-4453-b727-1a1c458828d0-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.420304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c4737b72-6133-4316-8b4e-1a7a3938cd05","Type":"ContainerStarted","Data":"41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856"} Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.422508 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c517f7af-3e33-4453-b727-1a1c458828d0","Type":"ContainerDied","Data":"d3a77b7c83674a5c7069dcad1ba0f9008ca4905004a8f32015b913e71266d3f8"} Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.422866 4763 scope.go:117] "RemoveContainer" containerID="56b368c578edab190400d0294d0111cac90771b3cc7d82995d2aa31c46c9525c" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.424299 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.445883 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.445866512 podStartE2EDuration="2.445866512s" podCreationTimestamp="2025-09-30 13:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:29.438050347 +0000 UTC m=+1321.576610652" watchObservedRunningTime="2025-09-30 13:57:29.445866512 +0000 UTC m=+1321.584426787" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.453738 4763 scope.go:117] "RemoveContainer" containerID="78a30fa0cb886d8732c7e08de0fab1cf5a4d537f41b538998049c0d6511e87fe" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.463918 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.475145 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.489864 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 13:57:29 crc kubenswrapper[4763]: E0930 13:57:29.490282 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c517f7af-3e33-4453-b727-1a1c458828d0" containerName="nova-api-api" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.490304 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c517f7af-3e33-4453-b727-1a1c458828d0" containerName="nova-api-api" Sep 30 13:57:29 crc kubenswrapper[4763]: E0930 13:57:29.490333 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c517f7af-3e33-4453-b727-1a1c458828d0" containerName="nova-api-log" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.490341 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c517f7af-3e33-4453-b727-1a1c458828d0" containerName="nova-api-log" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.490681 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c517f7af-3e33-4453-b727-1a1c458828d0" containerName="nova-api-api" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.490820 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c517f7af-3e33-4453-b727-1a1c458828d0" containerName="nova-api-log" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.492029 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.496193 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.496244 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.496197 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.501159 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.567512 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-public-tls-certs\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.567551 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-logs\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.567776 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-config-data\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.567825 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.567908 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.567941 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx728\" (UniqueName: \"kubernetes.io/projected/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-kube-api-access-lx728\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.669411 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.669513 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx728\" (UniqueName: \"kubernetes.io/projected/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-kube-api-access-lx728\") 
pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.669553 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-public-tls-certs\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.669571 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-logs\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.669656 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-config-data\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.669690 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.670354 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-logs\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.675006 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.675444 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-public-tls-certs\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.678471 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-config-data\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.684540 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 30 13:57:29 crc kubenswrapper[4763]: I0930 13:57:29.687440 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx728\" (UniqueName: \"kubernetes.io/projected/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-kube-api-access-lx728\") pod \"nova-api-0\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " pod="openstack/nova-api-0" Sep 
Sep 30 13:57:32 crc kubenswrapper[4763]: I0930 13:57:30.503187 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c517f7af-3e33-4453-b727-1a1c458828d0" path="/var/lib/kubelet/pods/c517f7af-3e33-4453-b727-1a1c458828d0/volumes"
Sep 30 13:57:32 crc kubenswrapper[4763]: I0930 13:57:31.007530 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 13:57:32 crc kubenswrapper[4763]: I0930 13:57:31.007584 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 13:57:32 crc kubenswrapper[4763]: I0930 13:57:32.799791 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Sep 30 13:57:32 crc kubenswrapper[4763]: I0930 13:57:32.958228 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 13:57:33 crc kubenswrapper[4763]: I0930 13:57:33.461531 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6002d74d-668d-4f30-b13a-c87ec6a8a3b8","Type":"ContainerStarted","Data":"963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1"}
Sep 30 13:57:33 crc kubenswrapper[4763]: I0930 13:57:33.461932 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6002d74d-668d-4f30-b13a-c87ec6a8a3b8","Type":"ContainerStarted","Data":"d369c3078420273587e74db874b80398e8211b88c5743d42c0c7a636c9f08c8b"}
Sep 30 13:57:34 crc kubenswrapper[4763]: I0930 13:57:34.473880 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6002d74d-668d-4f30-b13a-c87ec6a8a3b8","Type":"ContainerStarted","Data":"15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7"}
Sep 30 13:57:34 crc kubenswrapper[4763]: I0930 13:57:34.517367 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=5.5173511 podStartE2EDuration="5.5173511s" podCreationTimestamp="2025-09-30 13:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:34.498733773 +0000 UTC m=+1326.637294058" watchObservedRunningTime="2025-09-30 13:57:34.5173511 +0000 UTC m=+1326.655911385"
Sep 30 13:57:36 crc kubenswrapper[4763]: I0930 13:57:36.007799 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Sep 30 13:57:36 crc kubenswrapper[4763]: I0930 13:57:36.007852 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Sep 30 13:57:36 crc kubenswrapper[4763]: I0930 13:57:36.060317 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 13:57:36 crc kubenswrapper[4763]: I0930 13:57:36.060377 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
connect: connection refused" Sep 30 13:57:36 crc kubenswrapper[4763]: I0930 13:57:36.060423 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 13:57:36 crc kubenswrapper[4763]: I0930 13:57:36.061116 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93b97d46ec993310482c9f94e284fd8475a6addbce7a122971ed13904ff04071"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:57:36 crc kubenswrapper[4763]: I0930 13:57:36.061173 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://93b97d46ec993310482c9f94e284fd8475a6addbce7a122971ed13904ff04071" gracePeriod=600 Sep 30 13:57:36 crc kubenswrapper[4763]: I0930 13:57:36.502574 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="93b97d46ec993310482c9f94e284fd8475a6addbce7a122971ed13904ff04071" exitCode=0 Sep 30 13:57:36 crc kubenswrapper[4763]: I0930 13:57:36.502639 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"93b97d46ec993310482c9f94e284fd8475a6addbce7a122971ed13904ff04071"} Sep 30 13:57:36 crc kubenswrapper[4763]: I0930 13:57:36.502971 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3"} Sep 30 13:57:36 crc kubenswrapper[4763]: I0930 13:57:36.502996 4763 scope.go:117] "RemoveContainer" containerID="929286b0798b4123a28e4fd7afc0d057a5a3facafe7726db3c5285288ca63279" Sep 30 13:57:37 crc kubenswrapper[4763]: I0930 13:57:37.021854 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:57:37 crc kubenswrapper[4763]: I0930 13:57:37.022004 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:57:37 crc kubenswrapper[4763]: I0930 13:57:37.800079 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 13:57:37 crc kubenswrapper[4763]: I0930 13:57:37.833883 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 13:57:38 crc kubenswrapper[4763]: I0930 13:57:38.550446 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 13:57:39 crc kubenswrapper[4763]: I0930 13:57:39.745474 4763 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 13:57:39 crc kubenswrapper[4763]: I0930 13:57:39.819234 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:57:39 crc kubenswrapper[4763]: I0930 13:57:39.819297 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:57:40 crc kubenswrapper[4763]: I0930 13:57:40.833760 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:57:40 crc kubenswrapper[4763]: I0930 13:57:40.833832 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:57:46 crc kubenswrapper[4763]: I0930 13:57:46.014682 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 13:57:46 crc kubenswrapper[4763]: I0930 13:57:46.015230 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 13:57:46 crc kubenswrapper[4763]: I0930 13:57:46.020209 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 13:57:46 crc kubenswrapper[4763]: I0930 13:57:46.022798 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 13:57:49 crc kubenswrapper[4763]: I0930 13:57:49.824834 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 13:57:49 crc kubenswrapper[4763]: I0930 13:57:49.825850 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 13:57:49 crc kubenswrapper[4763]: I0930 13:57:49.828050 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 13:57:49 crc kubenswrapper[4763]: I0930 13:57:49.834166 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 13:57:50 crc kubenswrapper[4763]: I0930 13:57:50.648573 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 13:57:50 crc kubenswrapper[4763]: I0930 13:57:50.655298 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.321415 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.322080 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="98e98c9d-b727-4c5b-857b-13064b0ef92f" containerName="openstackclient" containerID="cri-o://e4eaf436f4bb9a039d79a107f25f1b38d03cc925e22d803d54fdd98495213540" gracePeriod=2 Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.337571 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.581680 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 13:58:08 crc kubenswrapper[4763]: E0930 13:58:08.766972 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Sep 30 13:58:08 crc kubenswrapper[4763]: E0930 13:58:08.767516 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data podName:aebd5213-18eb-4d84-b39e-fd22f9ff9a6c nodeName:}" failed. No retries permitted until 2025-09-30 13:58:09.26745912 +0000 UTC m=+1361.406019405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data") pod "rabbitmq-cell1-server-0" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c") : configmap "rabbitmq-cell1-config-data" not found Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.827834 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance9e56-account-delete-4pzvl"] Sep 30 13:58:08 crc kubenswrapper[4763]: E0930 13:58:08.828624 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e98c9d-b727-4c5b-857b-13064b0ef92f" containerName="openstackclient" Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.828648 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e98c9d-b727-4c5b-857b-13064b0ef92f" containerName="openstackclient" Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.828914 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e98c9d-b727-4c5b-857b-13064b0ef92f" containerName="openstackclient" Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.829799 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance9e56-account-delete-4pzvl" Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.892676 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.911683 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance9e56-account-delete-4pzvl"] Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.939327 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.939596 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="916727b2-6488-4edf-b33b-c5908eae0e41" containerName="ovn-northd" containerID="cri-o://e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737" gracePeriod=30 Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.939755 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="916727b2-6488-4edf-b33b-c5908eae0e41" containerName="openstack-network-exporter" containerID="cri-o://777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b" gracePeriod=30 Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.967926 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutrona40b-account-delete-z82vw"] Sep 30 13:58:08 crc kubenswrapper[4763]: I0930 13:58:08.969291 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutrona40b-account-delete-z82vw" Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.037532 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutrona40b-account-delete-z82vw"] Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.079111 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementb89d-account-delete-df5j7"] Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.080480 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementb89d-account-delete-df5j7" Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.082497 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvbqc\" (UniqueName: \"kubernetes.io/projected/fcb2a96e-6374-4a22-a7fd-058bfdefac42-kube-api-access-kvbqc\") pod \"neutrona40b-account-delete-z82vw\" (UID: \"fcb2a96e-6374-4a22-a7fd-058bfdefac42\") " pod="openstack/neutrona40b-account-delete-z82vw" Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.082727 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4pwj\" (UniqueName: \"kubernetes.io/projected/7550cde2-d6ca-4dc1-8772-5eea0a9b8142-kube-api-access-v4pwj\") pod \"placementb89d-account-delete-df5j7\" (UID: \"7550cde2-d6ca-4dc1-8772-5eea0a9b8142\") " pod="openstack/placementb89d-account-delete-df5j7" Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.082800 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qtqv\" (UniqueName: \"kubernetes.io/projected/cf9f1fd7-72d5-4f11-b8c8-5e941597ca75-kube-api-access-6qtqv\") pod \"glance9e56-account-delete-4pzvl\" (UID: \"cf9f1fd7-72d5-4f11-b8c8-5e941597ca75\") " pod="openstack/glance9e56-account-delete-4pzvl" Sep 30 13:58:09 crc kubenswrapper[4763]: E0930 13:58:09.095651 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Sep 30 13:58:09 crc kubenswrapper[4763]: E0930 13:58:09.095700 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data podName:3119638a-6580-4a24-8e7f-40f7f7d788a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:09.595686365 +0000 UTC m=+1361.734246650 (durationBeforeRetry 500ms). 
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.184272 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementb89d-account-delete-df5j7"]
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.185322 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4pwj\" (UniqueName: \"kubernetes.io/projected/7550cde2-d6ca-4dc1-8772-5eea0a9b8142-kube-api-access-v4pwj\") pod \"placementb89d-account-delete-df5j7\" (UID: \"7550cde2-d6ca-4dc1-8772-5eea0a9b8142\") " pod="openstack/placementb89d-account-delete-df5j7"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.185376 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qtqv\" (UniqueName: \"kubernetes.io/projected/cf9f1fd7-72d5-4f11-b8c8-5e941597ca75-kube-api-access-6qtqv\") pod \"glance9e56-account-delete-4pzvl\" (UID: \"cf9f1fd7-72d5-4f11-b8c8-5e941597ca75\") " pod="openstack/glance9e56-account-delete-4pzvl"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.185410 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvbqc\" (UniqueName: \"kubernetes.io/projected/fcb2a96e-6374-4a22-a7fd-058bfdefac42-kube-api-access-kvbqc\") pod \"neutrona40b-account-delete-z82vw\" (UID: \"fcb2a96e-6374-4a22-a7fd-058bfdefac42\") " pod="openstack/neutrona40b-account-delete-z82vw"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.230461 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qtqv\" (UniqueName: \"kubernetes.io/projected/cf9f1fd7-72d5-4f11-b8c8-5e941597ca75-kube-api-access-6qtqv\") pod \"glance9e56-account-delete-4pzvl\" (UID: \"cf9f1fd7-72d5-4f11-b8c8-5e941597ca75\") " pod="openstack/glance9e56-account-delete-4pzvl"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.253488 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvbqc\" (UniqueName: \"kubernetes.io/projected/fcb2a96e-6374-4a22-a7fd-058bfdefac42-kube-api-access-kvbqc\") pod \"neutrona40b-account-delete-z82vw\" (UID: \"fcb2a96e-6374-4a22-a7fd-058bfdefac42\") " pod="openstack/neutrona40b-account-delete-z82vw"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.265447 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4pwj\" (UniqueName: \"kubernetes.io/projected/7550cde2-d6ca-4dc1-8772-5eea0a9b8142-kube-api-access-v4pwj\") pod \"placementb89d-account-delete-df5j7\" (UID: \"7550cde2-d6ca-4dc1-8772-5eea0a9b8142\") " pod="openstack/placementb89d-account-delete-df5j7"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.276373 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-q6kdz"]
Sep 30 13:58:09 crc kubenswrapper[4763]: E0930 13:58:09.299017 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Sep 30 13:58:09 crc kubenswrapper[4763]: E0930 13:58:09.299085 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data podName:aebd5213-18eb-4d84-b39e-fd22f9ff9a6c nodeName:}" failed. No retries permitted until 2025-09-30 13:58:10.299068183 +0000 UTC m=+1362.437628468 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data") pod "rabbitmq-cell1-server-0" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c") : configmap "rabbitmq-cell1-config-data" not found
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.310277 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutrona40b-account-delete-z82vw"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.348079 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-q6kdz"]
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.414015 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder97e0-account-delete-988h9"]
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.415634 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder97e0-account-delete-988h9"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.450388 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder97e0-account-delete-988h9"]
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.464446 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-xmvws"]
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.467925 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance9e56-account-delete-4pzvl"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.509238 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementb89d-account-delete-df5j7"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.548845 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-r6p2n"]
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.585676 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-xmvws"]
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.607538 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hbhc\" (UniqueName: \"kubernetes.io/projected/ae274648-abe2-416e-a43d-edc836edc424-kube-api-access-7hbhc\") pod \"cinder97e0-account-delete-988h9\" (UID: \"ae274648-abe2-416e-a43d-edc836edc424\") " pod="openstack/cinder97e0-account-delete-988h9"
Sep 30 13:58:09 crc kubenswrapper[4763]: E0930 13:58:09.616089 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Sep 30 13:58:09 crc kubenswrapper[4763]: E0930 13:58:09.616136 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data podName:3119638a-6580-4a24-8e7f-40f7f7d788a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:10.61612183 +0000 UTC m=+1362.754682115 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data") pod "rabbitmq-server-0" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5") : configmap "rabbitmq-config-data" not found
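Both rabbitmq config-data volumes keep failing because their ConfigMaps have already been deleted, and nestedpendingoperations.go backs off exponentially: durationBeforeRetry starts at 500ms, is 1s by this point, and reaches 2s further down. A small Go sketch of that doubling (the ceiling is an assumption for illustration; this excerpt ends before the kubelet stops doubling):

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	delay := 500 * time.Millisecond       // first durationBeforeRetry seen above
    	maxDelay := 2*time.Minute + 2*time.Second // assumed cap, not shown in this log
    	for attempt := 1; attempt <= 6; attempt++ {
    		fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
    		delay *= 2 // 500ms -> 1s -> 2s -> ..., matching the progression in the log
    		if delay > maxDelay {
    			delay = maxDelay
    		}
    	}
    }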
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.709446 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hbhc\" (UniqueName: \"kubernetes.io/projected/ae274648-abe2-416e-a43d-edc836edc424-kube-api-access-7hbhc\") pod \"cinder97e0-account-delete-988h9\" (UID: \"ae274648-abe2-416e-a43d-edc836edc424\") " pod="openstack/cinder97e0-account-delete-988h9"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.734848 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-r6p2n"]
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.769112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hbhc\" (UniqueName: \"kubernetes.io/projected/ae274648-abe2-416e-a43d-edc836edc424-kube-api-access-7hbhc\") pod \"cinder97e0-account-delete-988h9\" (UID: \"ae274648-abe2-416e-a43d-edc836edc424\") " pod="openstack/cinder97e0-account-delete-988h9"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.801398 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.847613 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.847928 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" containerName="openstack-network-exporter" containerID="cri-o://823878f5a22f30c2add397afa8ccc5fd623a8d47193a01552f23a33548ef8021" gracePeriod=300
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.856259 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder97e0-account-delete-988h9"
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.891961 4763 generic.go:334] "Generic (PLEG): container finished" podID="916727b2-6488-4edf-b33b-c5908eae0e41" containerID="777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b" exitCode=2
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.892013 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"916727b2-6488-4edf-b33b-c5908eae0e41","Type":"ContainerDied","Data":"777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b"}
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.926217 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 13:58:09 crc kubenswrapper[4763]: I0930 13:58:09.960007 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" containerName="rabbitmq" containerID="cri-o://28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9" gracePeriod=604800
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.010256 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican41ee-account-delete-q7t27"]
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.012792 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican41ee-account-delete-q7t27"
Need to start a new one" pod="openstack/barbican41ee-account-delete-q7t27" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.027247 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3119638a-6580-4a24-8e7f-40f7f7d788a5" containerName="rabbitmq" containerID="cri-o://c5af37dfd26586dfbe5d5f60114f298ea522d4e3bbbc87c8e965efa23a5cf953" gracePeriod=604800 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.062642 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican41ee-account-delete-q7t27"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.121408 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-zjgmb"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.128269 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-zjgmb"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.131358 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmjcp\" (UniqueName: \"kubernetes.io/projected/4a588d68-fc19-4242-9b61-0ed79678fc9e-kube-api-access-bmjcp\") pod \"barbican41ee-account-delete-q7t27\" (UID: \"4a588d68-fc19-4242-9b61-0ed79678fc9e\") " pod="openstack/barbican41ee-account-delete-q7t27" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.138037 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0903f-account-delete-2hlbl"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.139304 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0903f-account-delete-2hlbl" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.177470 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0903f-account-delete-2hlbl"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.233190 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapibbaf-account-delete-rr8rm"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.240568 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmjcp\" (UniqueName: \"kubernetes.io/projected/4a588d68-fc19-4242-9b61-0ed79678fc9e-kube-api-access-bmjcp\") pod \"barbican41ee-account-delete-q7t27\" (UID: \"4a588d68-fc19-4242-9b61-0ed79678fc9e\") " pod="openstack/barbican41ee-account-delete-q7t27" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.240700 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhq6h\" (UniqueName: \"kubernetes.io/projected/7b9bf16b-039c-46ba-ae41-f0622530202d-kube-api-access-lhq6h\") pod \"novacell0903f-account-delete-2hlbl\" (UID: \"7b9bf16b-039c-46ba-ae41-f0622530202d\") " pod="openstack/novacell0903f-account-delete-2hlbl" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.250012 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-72z5c"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.250147 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapibbaf-account-delete-rr8rm" Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.255143 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.267288 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kwz5v"] Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.271344 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.279265 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.279322 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="916727b2-6488-4edf-b33b-c5908eae0e41" containerName="ovn-northd" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.280915 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapibbaf-account-delete-rr8rm"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.291940 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" containerName="ovsdbserver-nb" containerID="cri-o://780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593" gracePeriod=300 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.295718 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-djfwj"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.295965 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-djfwj" podUID="03f3de76-2dd7-4d26-8010-72d5ff408190" containerName="openstack-network-exporter" containerID="cri-o://8a278d3c4ace4c4d3804e473924d1b56cd571d2b8cdd048d77caed140d79f478" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.309044 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-nps2w"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.331081 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-nps2w"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.332291 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmjcp\" (UniqueName: \"kubernetes.io/projected/4a588d68-fc19-4242-9b61-0ed79678fc9e-kube-api-access-bmjcp\") pod \"barbican41ee-account-delete-q7t27\" (UID: \"4a588d68-fc19-4242-9b61-0ed79678fc9e\") " 
pod="openstack/barbican41ee-account-delete-q7t27" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.367884 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzb6c\" (UniqueName: \"kubernetes.io/projected/0e45a139-0079-45cc-89a9-b1a0b0c1d179-kube-api-access-pzb6c\") pod \"novaapibbaf-account-delete-rr8rm\" (UID: \"0e45a139-0079-45cc-89a9-b1a0b0c1d179\") " pod="openstack/novaapibbaf-account-delete-rr8rm" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.367985 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhq6h\" (UniqueName: \"kubernetes.io/projected/7b9bf16b-039c-46ba-ae41-f0622530202d-kube-api-access-lhq6h\") pod \"novacell0903f-account-delete-2hlbl\" (UID: \"7b9bf16b-039c-46ba-ae41-f0622530202d\") " pod="openstack/novacell0903f-account-delete-2hlbl" Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.368318 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593" cmd=["/usr/bin/pidof","ovsdb-server"] Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.368465 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.368505 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data podName:aebd5213-18eb-4d84-b39e-fd22f9ff9a6c nodeName:}" failed. No retries permitted until 2025-09-30 13:58:12.368491242 +0000 UTC m=+1364.507051527 (durationBeforeRetry 2s). 
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.370055 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-sn928"]
Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.387404 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593 is running failed: container process not found" containerID="780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593" cmd=["/usr/bin/pidof","ovsdb-server"]
Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.397917 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593 is running failed: container process not found" containerID="780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593" cmd=["/usr/bin/pidof","ovsdb-server"]
Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.397978 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" containerName="ovsdbserver-nb"
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.448023 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhq6h\" (UniqueName: \"kubernetes.io/projected/7b9bf16b-039c-46ba-ae41-f0622530202d-kube-api-access-lhq6h\") pod \"novacell0903f-account-delete-2hlbl\" (UID: \"7b9bf16b-039c-46ba-ae41-f0622530202d\") " pod="openstack/novacell0903f-account-delete-2hlbl"
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.470427 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzb6c\" (UniqueName: \"kubernetes.io/projected/0e45a139-0079-45cc-89a9-b1a0b0c1d179-kube-api-access-pzb6c\") pod \"novaapibbaf-account-delete-rr8rm\" (UID: \"0e45a139-0079-45cc-89a9-b1a0b0c1d179\") " pod="openstack/novaapibbaf-account-delete-rr8rm"
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.531457 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56447315-e00d-4a65-9ee4-c58432d2ebca" path="/var/lib/kubelet/pods/56447315-e00d-4a65-9ee4-c58432d2ebca/volumes"
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.532385 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3f0264-cce9-436f-923d-79f807488437" path="/var/lib/kubelet/pods/7c3f0264-cce9-436f-923d-79f807488437/volumes"
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.532927 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96adbfe1-e6f8-4460-b999-a213cb396c4b" path="/var/lib/kubelet/pods/96adbfe1-e6f8-4460-b999-a213cb396c4b/volumes"
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.533960 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbda58bd-f991-4d2d-ba12-f3945505afa6" path="/var/lib/kubelet/pods/cbda58bd-f991-4d2d-ba12-f3945505afa6/volumes"
path="/var/lib/kubelet/pods/cbda58bd-f991-4d2d-ba12-f3945505afa6/volumes" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.534500 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e000c274-a7a0-493f-a0ea-537e5c474cb0" path="/var/lib/kubelet/pods/e000c274-a7a0-493f-a0ea-537e5c474cb0/volumes" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.535356 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-sn928"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.536824 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzb6c\" (UniqueName: \"kubernetes.io/projected/0e45a139-0079-45cc-89a9-b1a0b0c1d179-kube-api-access-pzb6c\") pod \"novaapibbaf-account-delete-rr8rm\" (UID: \"0e45a139-0079-45cc-89a9-b1a0b0c1d179\") " pod="openstack/novaapibbaf-account-delete-rr8rm" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.566659 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-ktclz"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.566947 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" podUID="2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" containerName="dnsmasq-dns" containerID="cri-o://f18e050afde37900d0b00f1f42394f96b83e1b630126fc3ff1f6312b776bc3ae" gracePeriod=10 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.632725 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ftn4t"] Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.676862 4763 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-kwz5v" message="Exiting ovn-controller (1) " Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.676900 4763 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-kwz5v" podUID="1db73295-0655-443c-91e0-2cd08b119141" containerName="ovn-controller" containerID="cri-o://6929f8af8d3cf797dc4b407e18a7a6d4c22dc654105d94f4fa1d84446a16b519" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.676931 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-kwz5v" podUID="1db73295-0655-443c-91e0-2cd08b119141" containerName="ovn-controller" containerID="cri-o://6929f8af8d3cf797dc4b407e18a7a6d4c22dc654105d94f4fa1d84446a16b519" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.677215 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Sep 30 13:58:10 crc kubenswrapper[4763]: E0930 13:58:10.677255 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data podName:3119638a-6580-4a24-8e7f-40f7f7d788a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:12.677239931 +0000 UTC m=+1364.815800216 (durationBeforeRetry 2s). 
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.708137 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ftn4t"]
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.866926 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75bcdb8fc9-ml4n8"]
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.867220 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75bcdb8fc9-ml4n8" podUID="02c33b2c-ca4f-45a8-9920-63df9fc79108" containerName="neutron-api" containerID="cri-o://e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb" gracePeriod=30
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.867717 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75bcdb8fc9-ml4n8" podUID="02c33b2c-ca4f-45a8-9920-63df9fc79108" containerName="neutron-httpd" containerID="cri-o://d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb" gracePeriod=30
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.903447 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.903996 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="b611e133-1d4a-49a8-9632-bdb825d41fa4" containerName="openstack-network-exporter" containerID="cri-o://ecb99a945c2d9b0bf36ec0c4004dd06419870d85ada8f6a288f0b829f450fa4d" gracePeriod=300
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.926231 4763 generic.go:334] "Generic (PLEG): container finished" podID="1db73295-0655-443c-91e0-2cd08b119141" containerID="6929f8af8d3cf797dc4b407e18a7a6d4c22dc654105d94f4fa1d84446a16b519" exitCode=0
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.926382 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5b87bfdd4b-tbjxc"]
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.926413 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kwz5v" event={"ID":"1db73295-0655-443c-91e0-2cd08b119141","Type":"ContainerDied","Data":"6929f8af8d3cf797dc4b407e18a7a6d4c22dc654105d94f4fa1d84446a16b519"}
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.926673 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5b87bfdd4b-tbjxc" podUID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerName="placement-log" containerID="cri-o://0c4968490ce8a08e1aec5c2072537900212ab6566f70cf01898816dd71f1b15c" gracePeriod=30
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.927105 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5b87bfdd4b-tbjxc" podUID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerName="placement-api" containerID="cri-o://6e00eb474337eb85a3ae6ce678a0a8afddc2bad42ef7bdbf41de0b427ce3b086" gracePeriod=30
Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.941924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutrona40b-account-delete-z82vw" event={"ID":"fcb2a96e-6374-4a22-a7fd-058bfdefac42","Type":"ContainerStarted","Data":"cfd4f851da045307bf1d532bed1c20be921d4976f86033d85efe8344a18298b1"}
event={"ID":"fcb2a96e-6374-4a22-a7fd-058bfdefac42","Type":"ContainerStarted","Data":"cfd4f851da045307bf1d532bed1c20be921d4976f86033d85efe8344a18298b1"} Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.944541 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-djfwj_03f3de76-2dd7-4d26-8010-72d5ff408190/openstack-network-exporter/0.log" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.944629 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-djfwj" event={"ID":"03f3de76-2dd7-4d26-8010-72d5ff408190","Type":"ContainerDied","Data":"8a278d3c4ace4c4d3804e473924d1b56cd571d2b8cdd048d77caed140d79f478"} Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.944641 4763 generic.go:334] "Generic (PLEG): container finished" podID="03f3de76-2dd7-4d26-8010-72d5ff408190" containerID="8a278d3c4ace4c4d3804e473924d1b56cd571d2b8cdd048d77caed140d79f478" exitCode=2 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.968369 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutrona40b-account-delete-z82vw"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.969008 4763 generic.go:334] "Generic (PLEG): container finished" podID="2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" containerID="f18e050afde37900d0b00f1f42394f96b83e1b630126fc3ff1f6312b776bc3ae" exitCode=0 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.969070 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" event={"ID":"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3","Type":"ContainerDied","Data":"f18e050afde37900d0b00f1f42394f96b83e1b630126fc3ff1f6312b776bc3ae"} Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.990762 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f3cc8ad7-1903-4a9f-94a4-a84f47cd1189/ovsdbserver-nb/0.log" Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.991012 4763 generic.go:334] "Generic (PLEG): container finished" podID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" containerID="823878f5a22f30c2add397afa8ccc5fd623a8d47193a01552f23a33548ef8021" exitCode=2 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.991045 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189","Type":"ContainerDied","Data":"823878f5a22f30c2add397afa8ccc5fd623a8d47193a01552f23a33548ef8021"} Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.993259 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.994150 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-server" containerID="cri-o://dfe4428a4ee91686c8b839dc094b2cea3d884fe055392d209003e50ad9cecb05" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.994297 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-server" containerID="cri-o://da7282808861470139cef025a99057cbd65aa13cb0bfc0317356866852f5d03d" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.994680 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" 
containerName="swift-recon-cron" containerID="cri-o://c52de0c97e78063fc806d8831c3e0f7eba864de7670f488d153c3e4e13e7df72" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.994723 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="rsync" containerID="cri-o://a2c552587c9daa3eff2f6b01626bdb8930edcce9d10cefa5e6f2138456bab7ae" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.994756 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-expirer" containerID="cri-o://2e13d0b0d4911e364ee6a3df6a55c9fe084a5532f8df7d0fcfa8239cfa1bd7d8" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.994783 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-updater" containerID="cri-o://68319e480d02549a9670870fb2b799e7a229e796a6e2e64c34a0f931f5c2294a" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.994819 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-auditor" containerID="cri-o://242dc53e835ef062c4e6ffb487f5cb2cd09de49af6b5aef18aae943dfe19b104" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.994859 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-replicator" containerID="cri-o://7af9dffe8b6aec08e0ffc071adb335564cdf7ec832db594c4069392c84f63460" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.994912 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-server" containerID="cri-o://e59cd93a64db4c4a2e52fd8dec840f2e642f02a5898d661bf7aeab73f09ef3a3" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.994948 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-updater" containerID="cri-o://580d3253de72fffc16d0a36d6429d3d8a5a8907a3681e3d5a00508e74a43aeff" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.994975 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-auditor" containerID="cri-o://86fa7c116448649efc303d132f55a3b4d51ce4ff7728e8cb83a546ae8cc6be04" gracePeriod=30 Sep 30 13:58:10 crc kubenswrapper[4763]: I0930 13:58:10.995001 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-replicator" containerID="cri-o://17bbda96e72abf0e4fc5b512a5a8c030ec54be5d2e9697b12e425013fc6e5674" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:10.995032 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-auditor" 
containerID="cri-o://1bb4e132326be55cfb6d2c02cfd640df1ebca518cc39286f9fe76a41c347dda4" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:10.995074 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-reaper" containerID="cri-o://6913f1a8c201da716c04a9052d361c35e1f0beafd7a800065007dd41db8b8e2f" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:10.995121 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-replicator" containerID="cri-o://18e1cb42d1ac47e256e5579a10290ba641e04de49adb3a3798799607a90f1b1a" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.003497 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189","Type":"ContainerDied","Data":"780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593"} Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.003552 4763 generic.go:334] "Generic (PLEG): container finished" podID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" containerID="780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593" exitCode=143 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.013703 4763 generic.go:334] "Generic (PLEG): container finished" podID="98e98c9d-b727-4c5b-857b-13064b0ef92f" containerID="e4eaf436f4bb9a039d79a107f25f1b38d03cc925e22d803d54fdd98495213540" exitCode=137 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.017285 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="b611e133-1d4a-49a8-9632-bdb825d41fa4" containerName="ovsdbserver-sb" containerID="cri-o://f076032ba256059553984a2d073b2dcc74aadf98fe54ecddda41aaee3f716c6e" gracePeriod=300 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.040393 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.040781 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ba159e27-7a3b-4b90-a7db-de6135f8153c" containerName="glance-log" containerID="cri-o://1b8f0928a35a4ae56d6ef0cb85281920dd3c4a313f493d020961d40d139fa47b" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.041323 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ba159e27-7a3b-4b90-a7db-de6135f8153c" containerName="glance-httpd" containerID="cri-o://cb40ecb42c9c7e13873d46e9437c88735cec11180b4edf02393f88c403b8189b" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.083508 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rjt5b"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.083728 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-rjt5b"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.098730 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.099118 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8bb91013-85e0-4a13-9a06-0608b16a147b" 
containerName="cinder-scheduler" containerID="cri-o://cba654c3201589fdfeff007899f0f73ab1e63e34fc4ccb4f54ba1464e5755d0a" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.104047 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8bb91013-85e0-4a13-9a06-0608b16a147b" containerName="probe" containerID="cri-o://e1f5778fa17d7cddfefc9145f7fd206fb41d3fa0e3cf06f8bd8e12eb8a451d1b" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.135775 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.136086 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ce55d11a-887c-46e6-af05-90c3fca01e75" containerName="glance-log" containerID="cri-o://3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.136651 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ce55d11a-887c-46e6-af05-90c3fca01e75" containerName="glance-httpd" containerID="cri-o://26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.144528 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lz7ln"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.156322 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lz7ln"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.162837 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovs-vswitchd" containerID="cri-o://0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.166398 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.167046 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b321cfd6-9039-4fe6-a39c-619f101d5e30" containerName="cinder-api-log" containerID="cri-o://b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.167489 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b321cfd6-9039-4fe6-a39c-619f101d5e30" containerName="cinder-api" containerID="cri-o://ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.182464 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9e56-account-create-dxz6j"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.195489 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance9e56-account-delete-4pzvl"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.207517 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9e56-account-create-dxz6j"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.287616 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zjwcr"] Sep 30 13:58:11 crc 
kubenswrapper[4763]: E0930 13:58:11.295148 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ef00d68_6c21_4ee7_8be8_53f7c1edb2f3.slice/crio-conmon-f18e050afde37900d0b00f1f42394f96b83e1b630126fc3ff1f6312b776bc3ae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03f3de76_2dd7_4d26_8010_72d5ff408190.slice/crio-conmon-8a278d3c4ace4c4d3804e473924d1b56cd571d2b8cdd048d77caed140d79f478.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02c33b2c_ca4f_45a8_9920_63df9fc79108.slice/crio-d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15fbd312_35ac_4e62_ad60_ffccf94eab4a.slice/crio-conmon-0c4968490ce8a08e1aec5c2072537900212ab6566f70cf01898816dd71f1b15c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba72f8d4_1822_4bb5_a099_c15d4b00b701.slice/crio-conmon-242dc53e835ef062c4e6ffb487f5cb2cd09de49af6b5aef18aae943dfe19b104.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1db73295_0655_443c_91e0_2cd08b119141.slice/crio-conmon-6929f8af8d3cf797dc4b407e18a7a6d4c22dc654105d94f4fa1d84446a16b519.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba72f8d4_1822_4bb5_a099_c15d4b00b701.slice/crio-conmon-580d3253de72fffc16d0a36d6429d3d8a5a8907a3681e3d5a00508e74a43aeff.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e98c9d_b727_4c5b_857b_13064b0ef92f.slice/crio-conmon-e4eaf436f4bb9a039d79a107f25f1b38d03cc925e22d803d54fdd98495213540.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1db73295_0655_443c_91e0_2cd08b119141.slice/crio-6929f8af8d3cf797dc4b407e18a7a6d4c22dc654105d94f4fa1d84446a16b519.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba72f8d4_1822_4bb5_a099_c15d4b00b701.slice/crio-17bbda96e72abf0e4fc5b512a5a8c030ec54be5d2e9697b12e425013fc6e5674.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e98c9d_b727_4c5b_857b_13064b0ef92f.slice/crio-e4eaf436f4bb9a039d79a107f25f1b38d03cc925e22d803d54fdd98495213540.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba72f8d4_1822_4bb5_a099_c15d4b00b701.slice/crio-conmon-18e1cb42d1ac47e256e5579a10290ba641e04de49adb3a3798799607a90f1b1a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba72f8d4_1822_4bb5_a099_c15d4b00b701.slice/crio-580d3253de72fffc16d0a36d6429d3d8a5a8907a3681e3d5a00508e74a43aeff.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb611e133_1d4a_49a8_9632_bdb825d41fa4.slice/crio-f076032ba256059553984a2d073b2dcc74aadf98fe54ecddda41aaee3f716c6e.scope\": RecentStats: unable 
to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15fbd312_35ac_4e62_ad60_ffccf94eab4a.slice/crio-0c4968490ce8a08e1aec5c2072537900212ab6566f70cf01898816dd71f1b15c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03f3de76_2dd7_4d26_8010_72d5ff408190.slice/crio-8a278d3c4ace4c4d3804e473924d1b56cd571d2b8cdd048d77caed140d79f478.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba159e27_7a3b_4b90_a7db_de6135f8153c.slice/crio-conmon-1b8f0928a35a4ae56d6ef0cb85281920dd3c4a313f493d020961d40d139fa47b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba72f8d4_1822_4bb5_a099_c15d4b00b701.slice/crio-18e1cb42d1ac47e256e5579a10290ba641e04de49adb3a3798799607a90f1b1a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba72f8d4_1822_4bb5_a099_c15d4b00b701.slice/crio-6913f1a8c201da716c04a9052d361c35e1f0beafd7a800065007dd41db8b8e2f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ef00d68_6c21_4ee7_8be8_53f7c1edb2f3.slice/crio-f18e050afde37900d0b00f1f42394f96b83e1b630126fc3ff1f6312b776bc3ae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3cc8ad7_1903_4a9f_94a4_a84f47cd1189.slice/crio-conmon-780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba72f8d4_1822_4bb5_a099_c15d4b00b701.slice/crio-7af9dffe8b6aec08e0ffc071adb335564cdf7ec832db594c4069392c84f63460.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.332581 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zjwcr"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.342587 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican41ee-account-delete-q7t27" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.365313 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a40b-account-create-kpccm"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.441774 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a40b-account-create-kpccm"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.466760 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutrona40b-account-delete-z82vw"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.487766 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.488154 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerName="nova-api-log" containerID="cri-o://963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.488414 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerName="nova-api-api" containerID="cri-o://15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.507205 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 13:58:11 crc kubenswrapper[4763]: E0930 13:58:11.515248 4763 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Sep 30 13:58:11 crc kubenswrapper[4763]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Sep 30 13:58:11 crc kubenswrapper[4763]: + source /usr/local/bin/container-scripts/functions Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNBridge=br-int Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNRemote=tcp:localhost:6642 Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNEncapType=geneve Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNAvailabilityZones= Sep 30 13:58:11 crc kubenswrapper[4763]: ++ EnableChassisAsGateway=true Sep 30 13:58:11 crc kubenswrapper[4763]: ++ PhysicalNetworks= Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNHostName= Sep 30 13:58:11 crc kubenswrapper[4763]: ++ DB_FILE=/etc/openvswitch/conf.db Sep 30 13:58:11 crc kubenswrapper[4763]: ++ ovs_dir=/var/lib/openvswitch Sep 30 13:58:11 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Sep 30 13:58:11 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Sep 30 13:58:11 crc kubenswrapper[4763]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 30 13:58:11 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 30 13:58:11 crc kubenswrapper[4763]: + sleep 0.5 Sep 30 13:58:11 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 30 13:58:11 crc kubenswrapper[4763]: + sleep 0.5 Sep 30 13:58:11 crc kubenswrapper[4763]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 30 13:58:11 crc kubenswrapper[4763]: + cleanup_ovsdb_server_semaphore Sep 30 13:58:11 crc kubenswrapper[4763]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 30 13:58:11 crc kubenswrapper[4763]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Sep 30 13:58:11 crc kubenswrapper[4763]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-72z5c" message=< Sep 30 13:58:11 crc kubenswrapper[4763]: Exiting ovsdb-server (5) [ OK ] Sep 30 13:58:11 crc kubenswrapper[4763]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Sep 30 13:58:11 crc kubenswrapper[4763]: + source /usr/local/bin/container-scripts/functions Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNBridge=br-int Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNRemote=tcp:localhost:6642 Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNEncapType=geneve Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNAvailabilityZones= Sep 30 13:58:11 crc kubenswrapper[4763]: ++ EnableChassisAsGateway=true Sep 30 13:58:11 crc kubenswrapper[4763]: ++ PhysicalNetworks= Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNHostName= Sep 30 13:58:11 crc kubenswrapper[4763]: ++ DB_FILE=/etc/openvswitch/conf.db Sep 30 13:58:11 crc kubenswrapper[4763]: ++ ovs_dir=/var/lib/openvswitch Sep 30 13:58:11 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Sep 30 13:58:11 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Sep 30 13:58:11 crc kubenswrapper[4763]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 30 13:58:11 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 30 13:58:11 crc kubenswrapper[4763]: + sleep 0.5 Sep 30 13:58:11 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 30 13:58:11 crc kubenswrapper[4763]: + sleep 0.5 Sep 30 13:58:11 crc kubenswrapper[4763]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 30 13:58:11 crc kubenswrapper[4763]: + cleanup_ovsdb_server_semaphore Sep 30 13:58:11 crc kubenswrapper[4763]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 30 13:58:11 crc kubenswrapper[4763]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Sep 30 13:58:11 crc kubenswrapper[4763]: > Sep 30 13:58:11 crc kubenswrapper[4763]: E0930 13:58:11.515312 4763 kuberuntime_container.go:691] "PreStop hook failed" err=< Sep 30 13:58:11 crc kubenswrapper[4763]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Sep 30 13:58:11 crc kubenswrapper[4763]: + source /usr/local/bin/container-scripts/functions Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNBridge=br-int Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNRemote=tcp:localhost:6642 Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNEncapType=geneve Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNAvailabilityZones= Sep 30 13:58:11 crc kubenswrapper[4763]: ++ EnableChassisAsGateway=true Sep 30 13:58:11 crc kubenswrapper[4763]: ++ PhysicalNetworks= Sep 30 13:58:11 crc kubenswrapper[4763]: ++ OVNHostName= Sep 30 13:58:11 crc kubenswrapper[4763]: ++ DB_FILE=/etc/openvswitch/conf.db Sep 30 13:58:11 crc kubenswrapper[4763]: ++ ovs_dir=/var/lib/openvswitch Sep 30 13:58:11 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Sep 30 13:58:11 crc kubenswrapper[4763]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Sep 30 13:58:11 crc kubenswrapper[4763]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 30 13:58:11 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 30 13:58:11 crc kubenswrapper[4763]: + sleep 0.5 Sep 30 13:58:11 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 30 13:58:11 crc kubenswrapper[4763]: + sleep 0.5 Sep 30 13:58:11 crc kubenswrapper[4763]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 30 13:58:11 crc kubenswrapper[4763]: + cleanup_ovsdb_server_semaphore Sep 30 13:58:11 crc kubenswrapper[4763]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 30 13:58:11 crc kubenswrapper[4763]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Sep 30 13:58:11 crc kubenswrapper[4763]: > pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server" containerID="cri-o://276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.515355 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server" containerID="cri-o://276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" gracePeriod=29 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.541707 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.547646 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0903f-account-delete-2hlbl" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.556239 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerName="nova-metadata-log" containerID="cri-o://6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.556513 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerName="nova-metadata-metadata" containerID="cri-o://9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.567198 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibbaf-account-delete-rr8rm" Sep 30 13:58:11 crc kubenswrapper[4763]: W0930 13:58:11.590790 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7550cde2_d6ca_4dc1_8772_5eea0a9b8142.slice/crio-f3e40e4f3fde93bd5b0aacedcf9067d957fd5c3173624f059fa6c2e124e785aa WatchSource:0}: Error finding container f3e40e4f3fde93bd5b0aacedcf9067d957fd5c3173624f059fa6c2e124e785aa: Status 404 returned error can't find the container with id f3e40e4f3fde93bd5b0aacedcf9067d957fd5c3173624f059fa6c2e124e785aa Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.601640 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-djfwj_03f3de76-2dd7-4d26-8010-72d5ff408190/openstack-network-exporter/0.log" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.601700 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.611876 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-x49w7"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.637817 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-x49w7"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.641696 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.644487 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f3cc8ad7-1903-4a9f-94a4-a84f47cd1189/ovsdbserver-nb/0.log" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.644578 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.656502 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-97e0-account-create-72gbj"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.665809 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4d85-account-create-92v9x"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.673687 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-97e0-account-create-72gbj"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.685350 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4d85-account-create-92v9x"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.692654 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-dnvd2"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.704633 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-7ffj9"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.709392 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-7ffj9"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.724587 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-dnvd2"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.727908 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder97e0-account-delete-988h9"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729165 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-metrics-certs-tls-certs\") pod \"03f3de76-2dd7-4d26-8010-72d5ff408190\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729227 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729258 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-combined-ca-bundle\") pod \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729326 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovn-rundir\") pod \"03f3de76-2dd7-4d26-8010-72d5ff408190\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729355 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config-secret\") pod \"98e98c9d-b727-4c5b-857b-13064b0ef92f\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729406 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sthdv\" (UniqueName: 
\"kubernetes.io/projected/98e98c9d-b727-4c5b-857b-13064b0ef92f-kube-api-access-sthdv\") pod \"98e98c9d-b727-4c5b-857b-13064b0ef92f\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729441 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-scripts\") pod \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729475 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdbserver-nb-tls-certs\") pod \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729528 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-config\") pod \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729549 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmd79\" (UniqueName: \"kubernetes.io/projected/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-kube-api-access-kmd79\") pod \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729581 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03f3de76-2dd7-4d26-8010-72d5ff408190-config\") pod \"03f3de76-2dd7-4d26-8010-72d5ff408190\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729624 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdb-rundir\") pod \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729689 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-metrics-certs-tls-certs\") pod \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\" (UID: \"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729747 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config\") pod \"98e98c9d-b727-4c5b-857b-13064b0ef92f\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729766 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-combined-ca-bundle\") pod \"98e98c9d-b727-4c5b-857b-13064b0ef92f\" (UID: \"98e98c9d-b727-4c5b-857b-13064b0ef92f\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729791 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovs-rundir\") pod \"03f3de76-2dd7-4d26-8010-72d5ff408190\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729819 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-575xr\" (UniqueName: \"kubernetes.io/projected/03f3de76-2dd7-4d26-8010-72d5ff408190-kube-api-access-575xr\") pod \"03f3de76-2dd7-4d26-8010-72d5ff408190\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.729852 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-combined-ca-bundle\") pod \"03f3de76-2dd7-4d26-8010-72d5ff408190\" (UID: \"03f3de76-2dd7-4d26-8010-72d5ff408190\") " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.733030 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "03f3de76-2dd7-4d26-8010-72d5ff408190" (UID: "03f3de76-2dd7-4d26-8010-72d5ff408190"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.733360 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-config" (OuterVolumeSpecName: "config") pod "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" (UID: "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.736663 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "03f3de76-2dd7-4d26-8010-72d5ff408190" (UID: "03f3de76-2dd7-4d26-8010-72d5ff408190"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.736663 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03f3de76-2dd7-4d26-8010-72d5ff408190-config" (OuterVolumeSpecName: "config") pod "03f3de76-2dd7-4d26-8010-72d5ff408190" (UID: "03f3de76-2dd7-4d26-8010-72d5ff408190"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.737576 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" (UID: "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.738179 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-scripts" (OuterVolumeSpecName: "scripts") pod "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" (UID: "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.741483 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-kube-api-access-kmd79" (OuterVolumeSpecName: "kube-api-access-kmd79") pod "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" (UID: "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189"). InnerVolumeSpecName "kube-api-access-kmd79". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.776229 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" (UID: "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.781337 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e98c9d-b727-4c5b-857b-13064b0ef92f-kube-api-access-sthdv" (OuterVolumeSpecName: "kube-api-access-sthdv") pod "98e98c9d-b727-4c5b-857b-13064b0ef92f" (UID: "98e98c9d-b727-4c5b-857b-13064b0ef92f"). InnerVolumeSpecName "kube-api-access-sthdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.781410 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican41ee-account-delete-q7t27"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.791793 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f3de76-2dd7-4d26-8010-72d5ff408190-kube-api-access-575xr" (OuterVolumeSpecName: "kube-api-access-575xr") pod "03f3de76-2dd7-4d26-8010-72d5ff408190" (UID: "03f3de76-2dd7-4d26-8010-72d5ff408190"). InnerVolumeSpecName "kube-api-access-575xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.794235 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-41ee-account-create-256xb"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.816687 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-xxzdg"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.817561 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03f3de76-2dd7-4d26-8010-72d5ff408190" (UID: "03f3de76-2dd7-4d26-8010-72d5ff408190"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.832887 4763 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovs-rundir\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.832919 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-575xr\" (UniqueName: \"kubernetes.io/projected/03f3de76-2dd7-4d26-8010-72d5ff408190-kube-api-access-575xr\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.832930 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.832963 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.832974 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/03f3de76-2dd7-4d26-8010-72d5ff408190-ovn-rundir\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.832985 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sthdv\" (UniqueName: \"kubernetes.io/projected/98e98c9d-b727-4c5b-857b-13064b0ef92f-kube-api-access-sthdv\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.832995 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.833005 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.833016 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmd79\" (UniqueName: \"kubernetes.io/projected/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-kube-api-access-kmd79\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.833026 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03f3de76-2dd7-4d26-8010-72d5ff408190-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.833035 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.839769 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-41ee-account-create-256xb"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.849338 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-xxzdg"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.849406 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "98e98c9d-b727-4c5b-857b-13064b0ef92f" (UID: "98e98c9d-b727-4c5b-857b-13064b0ef92f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.849782 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "98e98c9d-b727-4c5b-857b-13064b0ef92f" (UID: "98e98c9d-b727-4c5b-857b-13064b0ef92f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.857443 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" (UID: "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.863969 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-903f-account-create-j2fds"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.871464 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-903f-account-create-j2fds"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.872129 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="e2c5264e-b119-4444-b954-c33b428294b5" containerName="galera" containerID="cri-o://a4f61c64a8df3d9915add4b261e934b36d4aae625742c1aa68894904c7c207d4" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.876658 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.877694 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0903f-account-delete-2hlbl"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.883080 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-bbaf-account-create-dzmfz"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.888465 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-bbaf-account-create-dzmfz"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.893944 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-bj78g"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.902656 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-bj78g"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.909583 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapibbaf-account-delete-rr8rm"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.917056 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" (UID: "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.923127 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-67bf5b69fb-ff2xw"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.923370 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-67bf5b69fb-ff2xw" podUID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerName="barbican-api-log" containerID="cri-o://e8da034aa8e3585dad3aebd273d533766fe99e7f47d29b8c0da60ce7e190c340" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.923793 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-67bf5b69fb-ff2xw" podUID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerName="barbican-api" containerID="cri-o://59b75d8a10fde456e075a28d38cdb8ef12838b4b2acfbdbbde03b04350659d72" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.923869 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "98e98c9d-b727-4c5b-857b-13064b0ef92f" (UID: "98e98c9d-b727-4c5b-857b-13064b0ef92f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.932278 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "03f3de76-2dd7-4d26-8010-72d5ff408190" (UID: "03f3de76-2dd7-4d26-8010-72d5ff408190"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.934550 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.934581 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.934619 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98e98c9d-b727-4c5b-857b-13064b0ef92f-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.934635 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e98c9d-b727-4c5b-857b-13064b0ef92f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.934648 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/03f3de76-2dd7-4d26-8010-72d5ff408190-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.934659 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.934669 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.937726 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-9b5dc4bf7-vwl5v"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.937995 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" podUID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerName="proxy-httpd" containerID="cri-o://3e0c5a3566149a9091d7d20437254c474eb77aa49ad67fd687b671660064adfb" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.938343 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" podUID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerName="proxy-server" containerID="cri-o://81acae4ba8a1fe31f7b7ec84384f6c7903c26616e109d6de747404a115029a84" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.944504 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" (UID: "f3cc8ad7-1903-4a9f-94a4-a84f47cd1189"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.960389 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-95cdd9cf8-gbh25"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.960786 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" podUID="aea8c25c-f29f-49ba-ab27-87c8661479ab" containerName="barbican-keystone-listener-log" containerID="cri-o://6f12ce438cdfca7bff7ec6b8d59f8bef94cce949cecfbd974e69e68743678f6d" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.961378 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" podUID="aea8c25c-f29f-49ba-ab27-87c8661479ab" containerName="barbican-keystone-listener" containerID="cri-o://8a1c727d333559a452f33984696e78504154274594d0d689186dfd04e4589f8b" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.965798 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5f884f68c5-j4x5x"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.966104 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5f884f68c5-j4x5x" podUID="bc331486-cb31-4169-a564-51f8527ec8dd" containerName="barbican-worker-log" containerID="cri-o://c8f854a8e0e8f8b63357c20a3ee69e40c128f3f024eaa531bfc9fe89a8b73296" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.966574 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5f884f68c5-j4x5x" podUID="bc331486-cb31-4169-a564-51f8527ec8dd" containerName="barbican-worker" containerID="cri-o://42b30ec43f1257d28794be7be6660214b1f78e8dcc9ff724d26c8c28a27d8b51" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.976779 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.977020 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="783d0307-40e6-4d1e-9728-b1fe356e6b52" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4f8c5e6c6bac428024dc97ceaef682e62b42b67d2f61d3a18743765dbbf6718d" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.985648 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.986233 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c4737b72-6133-4316-8b4e-1a7a3938cd05" containerName="nova-scheduler-scheduler" containerID="cri-o://41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856" gracePeriod=30 Sep 30 13:58:11 crc kubenswrapper[4763]: I0930 13:58:11.993972 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lkpxs"] Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.001831 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.002144 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="aefbc43e-494e-48a6-963c-7be9d0159387" 
containerName="nova-cell1-conductor-conductor" containerID="cri-o://585dbe18ef01f1b67ce719782f36c12ee759638e77c5c8ef61ae81a0620d03f1" gracePeriod=30 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.007552 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lkpxs"] Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.018963 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mgpxr"] Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.026952 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mgpxr"] Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.032418 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.032737 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="7373f404-a756-4321-bd57-e8d60585abff" containerName="nova-cell0-conductor-conductor" containerID="cri-o://605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043" gracePeriod=30 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.036744 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.041930 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementb89d-account-delete-df5j7"] Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.043999 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba159e27-7a3b-4b90-a7db-de6135f8153c" containerID="1b8f0928a35a4ae56d6ef0cb85281920dd3c4a313f493d020961d40d139fa47b" exitCode=143 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.044106 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba159e27-7a3b-4b90-a7db-de6135f8153c","Type":"ContainerDied","Data":"1b8f0928a35a4ae56d6ef0cb85281920dd3c4a313f493d020961d40d139fa47b"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.050096 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance9e56-account-delete-4pzvl"] Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.056101 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.056461 4763 generic.go:334] "Generic (PLEG): container finished" podID="02c33b2c-ca4f-45a8-9920-63df9fc79108" containerID="d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.056540 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75bcdb8fc9-ml4n8" event={"ID":"02c33b2c-ca4f-45a8-9920-63df9fc79108","Type":"ContainerDied","Data":"d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105058 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="a2c552587c9daa3eff2f6b01626bdb8930edcce9d10cefa5e6f2138456bab7ae" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105091 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="2e13d0b0d4911e364ee6a3df6a55c9fe084a5532f8df7d0fcfa8239cfa1bd7d8" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105102 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="68319e480d02549a9670870fb2b799e7a229e796a6e2e64c34a0f931f5c2294a" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105110 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="242dc53e835ef062c4e6ffb487f5cb2cd09de49af6b5aef18aae943dfe19b104" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105132 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="7af9dffe8b6aec08e0ffc071adb335564cdf7ec832db594c4069392c84f63460" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105144 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="da7282808861470139cef025a99057cbd65aa13cb0bfc0317356866852f5d03d" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105151 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="580d3253de72fffc16d0a36d6429d3d8a5a8907a3681e3d5a00508e74a43aeff" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105159 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="86fa7c116448649efc303d132f55a3b4d51ce4ff7728e8cb83a546ae8cc6be04" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105167 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="17bbda96e72abf0e4fc5b512a5a8c030ec54be5d2e9697b12e425013fc6e5674" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105176 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="e59cd93a64db4c4a2e52fd8dec840f2e642f02a5898d661bf7aeab73f09ef3a3" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105183 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="6913f1a8c201da716c04a9052d361c35e1f0beafd7a800065007dd41db8b8e2f" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105192 4763 generic.go:334] "Generic (PLEG): 
container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="1bb4e132326be55cfb6d2c02cfd640df1ebca518cc39286f9fe76a41c347dda4" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105200 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="18e1cb42d1ac47e256e5579a10290ba641e04de49adb3a3798799607a90f1b1a" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105207 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="dfe4428a4ee91686c8b839dc094b2cea3d884fe055392d209003e50ad9cecb05" exitCode=0 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105267 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"a2c552587c9daa3eff2f6b01626bdb8930edcce9d10cefa5e6f2138456bab7ae"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105297 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"2e13d0b0d4911e364ee6a3df6a55c9fe084a5532f8df7d0fcfa8239cfa1bd7d8"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105310 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"68319e480d02549a9670870fb2b799e7a229e796a6e2e64c34a0f931f5c2294a"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105320 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"242dc53e835ef062c4e6ffb487f5cb2cd09de49af6b5aef18aae943dfe19b104"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105331 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"7af9dffe8b6aec08e0ffc071adb335564cdf7ec832db594c4069392c84f63460"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105343 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"da7282808861470139cef025a99057cbd65aa13cb0bfc0317356866852f5d03d"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105353 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"580d3253de72fffc16d0a36d6429d3d8a5a8907a3681e3d5a00508e74a43aeff"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105364 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"86fa7c116448649efc303d132f55a3b4d51ce4ff7728e8cb83a546ae8cc6be04"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105375 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"17bbda96e72abf0e4fc5b512a5a8c030ec54be5d2e9697b12e425013fc6e5674"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105385 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"e59cd93a64db4c4a2e52fd8dec840f2e642f02a5898d661bf7aeab73f09ef3a3"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105397 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"6913f1a8c201da716c04a9052d361c35e1f0beafd7a800065007dd41db8b8e2f"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105408 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"1bb4e132326be55cfb6d2c02cfd640df1ebca518cc39286f9fe76a41c347dda4"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"18e1cb42d1ac47e256e5579a10290ba641e04de49adb3a3798799607a90f1b1a"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.105431 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"dfe4428a4ee91686c8b839dc094b2cea3d884fe055392d209003e50ad9cecb05"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.138591 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-config\") pod \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.138672 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2k7f\" (UniqueName: \"kubernetes.io/projected/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-kube-api-access-h2k7f\") pod \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.138707 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-nb\") pod \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.138763 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-svc\") pod \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.139654 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-sb\") pod \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.139814 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-swift-storage-0\") pod \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\" (UID: \"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3\") " Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.145427 4763 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-kube-api-access-h2k7f" (OuterVolumeSpecName: "kube-api-access-h2k7f") pod "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" (UID: "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3"). InnerVolumeSpecName "kube-api-access-h2k7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.168521 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="585dbe18ef01f1b67ce719782f36c12ee759638e77c5c8ef61ae81a0620d03f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.168898 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b611e133-1d4a-49a8-9632-bdb825d41fa4/ovsdbserver-sb/0.log" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.170928 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b611e133-1d4a-49a8-9632-bdb825d41fa4","Type":"ContainerDied","Data":"ecb99a945c2d9b0bf36ec0c4004dd06419870d85ada8f6a288f0b829f450fa4d"} Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.175444 4763 generic.go:334] "Generic (PLEG): container finished" podID="b611e133-1d4a-49a8-9632-bdb825d41fa4" containerID="ecb99a945c2d9b0bf36ec0c4004dd06419870d85ada8f6a288f0b829f450fa4d" exitCode=2 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.175489 4763 generic.go:334] "Generic (PLEG): container finished" podID="b611e133-1d4a-49a8-9632-bdb825d41fa4" containerID="f076032ba256059553984a2d073b2dcc74aadf98fe54ecddda41aaee3f716c6e" exitCode=143 Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.175644 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b611e133-1d4a-49a8-9632-bdb825d41fa4","Type":"ContainerDied","Data":"f076032ba256059553984a2d073b2dcc74aadf98fe54ecddda41aaee3f716c6e"} Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.185812 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="585dbe18ef01f1b67ce719782f36c12ee759638e77c5c8ef61ae81a0620d03f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.198099 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="585dbe18ef01f1b67ce719782f36c12ee759638e77c5c8ef61ae81a0620d03f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.198185 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="aefbc43e-494e-48a6-963c-7be9d0159387" containerName="nova-cell1-conductor-conductor" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.200618 4763 generic.go:334] "Generic (PLEG): container finished" podID="8bb91013-85e0-4a13-9a06-0608b16a147b" containerID="e1f5778fa17d7cddfefc9145f7fd206fb41d3fa0e3cf06f8bd8e12eb8a451d1b" exitCode=0 Sep 30 
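The three failed ExecSync calls above come from the kubelet running an exec readiness probe against a container that is already shutting down: CRI-O refuses to register a new exec PID once the container has entered termination. The pod manifests are not part of this log, but a probe that would produce exactly this cmd looks like the following sketch (types from k8s.io/api/core/v1; the package and function names are illustrative):

```go
package probes

import corev1 "k8s.io/api/core/v1"

// novaConductorReadiness sketches an exec readiness probe matching the cmd
// recorded above: ["/usr/bin/pgrep","-r","DRST","nova-conductor"].
// pgrep exits non-zero when no matching process is found, which is what
// makes it usable as a readiness check.
func novaConductorReadiness() *corev1.Probe {
	return &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			Exec: &corev1.ExecAction{
				Command: []string{"/usr/bin/pgrep", "-r", "DRST", "nova-conductor"},
			},
		},
	}
}
```

The repeated "Probe errored" lines are therefore expected noise during pod teardown rather than a health regression: the probe loop keeps firing until the container is fully stopped.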
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.200689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bb91013-85e0-4a13-9a06-0608b16a147b","Type":"ContainerDied","Data":"e1f5778fa17d7cddfefc9145f7fd206fb41d3fa0e3cf06f8bd8e12eb8a451d1b"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.207746 4763 generic.go:334] "Generic (PLEG): container finished" podID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerID="963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1" exitCode=143
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.207814 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6002d74d-668d-4f30-b13a-c87ec6a8a3b8","Type":"ContainerDied","Data":"963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.209724 4763 generic.go:334] "Generic (PLEG): container finished" podID="ce55d11a-887c-46e6-af05-90c3fca01e75" containerID="3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d" exitCode=143
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.209783 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce55d11a-887c-46e6-af05-90c3fca01e75","Type":"ContainerDied","Data":"3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.211712 4763 generic.go:334] "Generic (PLEG): container finished" podID="fcb2a96e-6374-4a22-a7fd-058bfdefac42" containerID="d9fa2c70ae01b905e89f072043e631965785e292d0fdd3d7833783fc483c2de5" exitCode=0
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.211765 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutrona40b-account-delete-z82vw" event={"ID":"fcb2a96e-6374-4a22-a7fd-058bfdefac42","Type":"ContainerDied","Data":"d9fa2c70ae01b905e89f072043e631965785e292d0fdd3d7833783fc483c2de5"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.214992 4763 generic.go:334] "Generic (PLEG): container finished" podID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerID="6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152" exitCode=143
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.215069 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"709e8d49-783d-44fb-8bcb-0b4ac2199efe","Type":"ContainerDied","Data":"6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.216961 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9e56-account-delete-4pzvl" event={"ID":"cf9f1fd7-72d5-4f11-b8c8-5e941597ca75","Type":"ContainerStarted","Data":"f4342fc5197bcad55bc24a8729b9bcabb4f803789b998265e7ff1c6f1a211663"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.217039 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance9e56-account-delete-4pzvl" podUID="cf9f1fd7-72d5-4f11-b8c8-5e941597ca75" containerName="mariadb-account-delete" containerID="cri-o://1b5be17d856f5ffa1ef3f73b9ba58a7f55f8b6924cca7141b43e2383ae2c3208" gracePeriod=30
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.220139 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementb89d-account-delete-df5j7" event={"ID":"7550cde2-d6ca-4dc1-8772-5eea0a9b8142","Type":"ContainerStarted","Data":"f3e40e4f3fde93bd5b0aacedcf9067d957fd5c3173624f059fa6c2e124e785aa"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.226665 4763 generic.go:334] "Generic (PLEG): container finished" podID="08cae05d-3853-4e7a-a66c-380c023d086b" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" exitCode=0
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.226784 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-72z5c" event={"ID":"08cae05d-3853-4e7a-a66c-380c023d086b","Type":"ContainerDied","Data":"276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.235022 4763 generic.go:334] "Generic (PLEG): container finished" podID="b321cfd6-9039-4fe6-a39c-619f101d5e30" containerID="b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11" exitCode=143
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.235102 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b321cfd6-9039-4fe6-a39c-619f101d5e30","Type":"ContainerDied","Data":"b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.245829 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2k7f\" (UniqueName: \"kubernetes.io/projected/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-kube-api-access-h2k7f\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.258260 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f3cc8ad7-1903-4a9f-94a4-a84f47cd1189/ovsdbserver-nb/0.log"
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.258632 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f3cc8ad7-1903-4a9f-94a4-a84f47cd1189","Type":"ContainerDied","Data":"9a25e9cdb67390ac3b4b5e800b4a3484640138398471ec6560b2922d885c8434"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.258676 4763 scope.go:117] "RemoveContainer" containerID="823878f5a22f30c2add397afa8ccc5fd623a8d47193a01552f23a33548ef8021"
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.258844 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.265069 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" (UID: "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.270315 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-config" (OuterVolumeSpecName: "config") pod "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" (UID: "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.270614 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" (UID: "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.271462 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" (UID: "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.274122 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.279848 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" (UID: "2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.283460 4763 generic.go:334] "Generic (PLEG): container finished" podID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerID="0c4968490ce8a08e1aec5c2072537900212ab6566f70cf01898816dd71f1b15c" exitCode=143
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.283538 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b87bfdd4b-tbjxc" event={"ID":"15fbd312-35ac-4e62-ad60-ffccf94eab4a","Type":"ContainerDied","Data":"0c4968490ce8a08e1aec5c2072537900212ab6566f70cf01898816dd71f1b15c"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.307151 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placementb89d-account-delete-df5j7" podStartSLOduration=3.30713026 podStartE2EDuration="3.30713026s" podCreationTimestamp="2025-09-30 13:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:58:12.265991659 +0000 UTC m=+1364.404551954" watchObservedRunningTime="2025-09-30 13:58:12.30713026 +0000 UTC m=+1364.445690545"
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.307692 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance9e56-account-delete-4pzvl" podStartSLOduration=4.307684244 podStartE2EDuration="4.307684244s" podCreationTimestamp="2025-09-30 13:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:58:12.277681151 +0000 UTC m=+1364.416241436" watchObservedRunningTime="2025-09-30 13:58:12.307684244 +0000 UTC m=+1364.446244549"
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.322353 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-djfwj_03f3de76-2dd7-4d26-8010-72d5ff408190/openstack-network-exporter/0.log"
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.322418 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-djfwj" event={"ID":"03f3de76-2dd7-4d26-8010-72d5ff408190","Type":"ContainerDied","Data":"fa8785e9a555620e38e2aa467798a883b6056a1798c9541327a357d6dcbc638e"}
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.322510 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-djfwj"
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.347921 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.348542 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.348560 4763 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.348572 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-config\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.348583 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.357159 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder97e0-account-delete-988h9"]
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.423437 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapibbaf-account-delete-rr8rm"]
Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.456397 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.456497 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data podName:aebd5213-18eb-4d84-b39e-fd22f9ff9a6c nodeName:}" failed. No retries permitted until 2025-09-30 13:58:16.456476194 +0000 UTC m=+1368.595036469 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data") pod "rabbitmq-cell1-server-0" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c") : configmap "rabbitmq-cell1-config-data" not found
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.487322 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0903f-account-delete-2hlbl"]
Sep 30 13:58:12 crc kubenswrapper[4763]: W0930 13:58:12.503994 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b9bf16b_039c_46ba_ae41_f0622530202d.slice/crio-0f14aade8b4af8f3bd745e51714b8a0fae5fba3e4e8568965f4260f4ccc70e29 WatchSource:0}: Error finding container 0f14aade8b4af8f3bd745e51714b8a0fae5fba3e4e8568965f4260f4ccc70e29: Status 404 returned error can't find the container with id 0f14aade8b4af8f3bd745e51714b8a0fae5fba3e4e8568965f4260f4ccc70e29
Sep 30 13:58:12 crc kubenswrapper[4763]: W0930 13:58:12.534759 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a588d68_fc19_4242_9b61_0ed79678fc9e.slice/crio-5f8b5de9d649736ce71949b48e43c1af95542cb542439785a04e6c37d0a3441f WatchSource:0}: Error finding container 5f8b5de9d649736ce71949b48e43c1af95542cb542439785a04e6c37d0a3441f: Status 404 returned error can't find the container with id 5f8b5de9d649736ce71949b48e43c1af95542cb542439785a04e6c37d0a3441f
Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.534841 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.537862 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.539068 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.539110 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7373f404-a756-4321-bd57-e8d60585abff" containerName="nova-cell0-conductor-conductor"
Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.540082 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d448d7-005e-443a-9931-01565aa7a5f1" path="/var/lib/kubelet/pods/01d448d7-005e-443a-9931-01565aa7a5f1/volumes"
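The nestedpendingoperations record above is the volume manager's retry gate: after a MountVolume.SetUp failure it blocks retries of the same operation for an exponentially growing window, which is why the missing rabbitmq ConfigMap produces a 4s durationBeforeRetry here rather than a tight error loop. A minimal sketch of that policy, assuming the conventional 500ms initial delay and doubling factor (the exact constants are not visible in this log):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	initial := 500 * time.Millisecond // assumed starting delay
	maxDelay := 2 * time.Minute       // assumed upper bound
	delay := initial
	for attempt := 1; attempt <= 5; attempt++ {
		fmt.Printf("failure %d: no retries permitted for %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// Under these assumptions the fourth consecutive failure yields a 4s
	// window, matching "(durationBeforeRetry 4s)" in the record above.
}
```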
path="/var/lib/kubelet/pods/074a20c5-bc97-4a4b-9f11-60c63250120a/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.541131 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e" path="/var/lib/kubelet/pods/0884a0a8-c4db-4d6d-b3a1-7c6d14d97f7e/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.544856 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1942c414-009b-4326-be52-5cf277802681" path="/var/lib/kubelet/pods/1942c414-009b-4326-be52-5cf277802681/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.546470 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19caf19d-2082-4c4a-b091-54d6b3d3f1ea" path="/var/lib/kubelet/pods/19caf19d-2082-4c4a-b091-54d6b3d3f1ea/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.548962 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dfc595b-d82a-432b-aae3-fbde4c86b6d9" path="/var/lib/kubelet/pods/2dfc595b-d82a-432b-aae3-fbde4c86b6d9/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.552754 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303b3bc4-dd2a-4f55-8961-31033f17652c" path="/var/lib/kubelet/pods/303b3bc4-dd2a-4f55-8961-31033f17652c/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.553560 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed6a710-9784-49a1-aa61-59c509f2ff3d" path="/var/lib/kubelet/pods/3ed6a710-9784-49a1-aa61-59c509f2ff3d/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.554590 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d7990d-14a8-4872-ad27-85dc01c63f23" path="/var/lib/kubelet/pods/56d7990d-14a8-4872-ad27-85dc01c63f23/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.555249 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62310a9e-9a81-44c2-96f2-9e7064f883e9" path="/var/lib/kubelet/pods/62310a9e-9a81-44c2-96f2-9e7064f883e9/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.556695 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ed31b0-29b4-49fb-9a0b-0c07ef07706c" path="/var/lib/kubelet/pods/64ed31b0-29b4-49fb-9a0b-0c07ef07706c/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.557245 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7033b478-9e40-41ce-9c65-80a3c5c1273f" path="/var/lib/kubelet/pods/7033b478-9e40-41ce-9c65-80a3c5c1273f/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.557820 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f4dee33-3e98-4741-9001-bb28578a2a24" path="/var/lib/kubelet/pods/7f4dee33-3e98-4741-9001-bb28578a2a24/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.558529 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9793edca-7ca4-4ccc-8448-42b6897bb3b9" path="/var/lib/kubelet/pods/9793edca-7ca4-4ccc-8448-42b6897bb3b9/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.559701 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e98c9d-b727-4c5b-857b-13064b0ef92f" path="/var/lib/kubelet/pods/98e98c9d-b727-4c5b-857b-13064b0ef92f/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.560211 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c7985fd-8e51-4b64-8477-bd05c3577312" 
path="/var/lib/kubelet/pods/9c7985fd-8e51-4b64-8477-bd05c3577312/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.561170 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7" path="/var/lib/kubelet/pods/c3e417ea-91e0-4cb5-baf8-d5c6bd758ae7/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.562757 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df19ca80-6959-485e-b83a-b3c643874684" path="/var/lib/kubelet/pods/df19ca80-6959-485e-b83a-b3c643874684/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.563320 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1296f29-0be2-4be4-bb9e-3670307d9d05" path="/var/lib/kubelet/pods/f1296f29-0be2-4be4-bb9e-3670307d9d05/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.563927 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f17c5ea5-8c54-4352-be0f-60a61fb6b7ba" path="/var/lib/kubelet/pods/f17c5ea5-8c54-4352-be0f-60a61fb6b7ba/volumes" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.579361 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican41ee-account-delete-q7t27"] Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.767144 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.767213 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data podName:3119638a-6580-4a24-8e7f-40f7f7d788a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:16.767199003 +0000 UTC m=+1368.905759288 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data") pod "rabbitmq-server-0" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5") : configmap "rabbitmq-config-data" not found Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.800732 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.801944 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.803906 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:58:12 crc kubenswrapper[4763]: E0930 13:58:12.804160 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c4737b72-6133-4316-8b4e-1a7a3938cd05" containerName="nova-scheduler-scheduler" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.836076 4763 scope.go:117] "RemoveContainer" containerID="780e00531373dadf7cc33ef990849299b2d53f52e2cf4a6712935bd0c2f26593" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.976136 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b611e133-1d4a-49a8-9632-bdb825d41fa4/ovsdbserver-sb/0.log" Sep 30 13:58:12 crc kubenswrapper[4763]: I0930 13:58:12.976208 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.007277 4763 scope.go:117] "RemoveContainer" containerID="e4eaf436f4bb9a039d79a107f25f1b38d03cc925e22d803d54fdd98495213540" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.073536 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-config\") pod \"b611e133-1d4a-49a8-9632-bdb825d41fa4\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.073659 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdb-rundir\") pod \"b611e133-1d4a-49a8-9632-bdb825d41fa4\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.073714 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9tjf\" (UniqueName: \"kubernetes.io/projected/b611e133-1d4a-49a8-9632-bdb825d41fa4-kube-api-access-k9tjf\") pod \"b611e133-1d4a-49a8-9632-bdb825d41fa4\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.073786 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-scripts\") pod \"b611e133-1d4a-49a8-9632-bdb825d41fa4\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.073805 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdbserver-sb-tls-certs\") pod \"b611e133-1d4a-49a8-9632-bdb825d41fa4\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.074368 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-scripts" (OuterVolumeSpecName: "scripts") pod "b611e133-1d4a-49a8-9632-bdb825d41fa4" (UID: "b611e133-1d4a-49a8-9632-bdb825d41fa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.075085 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-config" (OuterVolumeSpecName: "config") pod "b611e133-1d4a-49a8-9632-bdb825d41fa4" (UID: "b611e133-1d4a-49a8-9632-bdb825d41fa4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.075745 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-combined-ca-bundle\") pod \"b611e133-1d4a-49a8-9632-bdb825d41fa4\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.075800 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-metrics-certs-tls-certs\") pod \"b611e133-1d4a-49a8-9632-bdb825d41fa4\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.075835 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b611e133-1d4a-49a8-9632-bdb825d41fa4\" (UID: \"b611e133-1d4a-49a8-9632-bdb825d41fa4\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.084333 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b611e133-1d4a-49a8-9632-bdb825d41fa4-kube-api-access-k9tjf" (OuterVolumeSpecName: "kube-api-access-k9tjf") pod "b611e133-1d4a-49a8-9632-bdb825d41fa4" (UID: "b611e133-1d4a-49a8-9632-bdb825d41fa4"). InnerVolumeSpecName "kube-api-access-k9tjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.086564 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "b611e133-1d4a-49a8-9632-bdb825d41fa4" (UID: "b611e133-1d4a-49a8-9632-bdb825d41fa4"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.094534 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kwz5v" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.121624 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "b611e133-1d4a-49a8-9632-bdb825d41fa4" (UID: "b611e133-1d4a-49a8-9632-bdb825d41fa4"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.143910 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b611e133-1d4a-49a8-9632-bdb825d41fa4" (UID: "b611e133-1d4a-49a8-9632-bdb825d41fa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.161886 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutrona40b-account-delete-z82vw" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.163718 4763 scope.go:117] "RemoveContainer" containerID="8a278d3c4ace4c4d3804e473924d1b56cd571d2b8cdd048d77caed140d79f478" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.191873 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs7n8\" (UniqueName: \"kubernetes.io/projected/1db73295-0655-443c-91e0-2cd08b119141-kube-api-access-rs7n8\") pod \"1db73295-0655-443c-91e0-2cd08b119141\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.192270 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-ovn-controller-tls-certs\") pod \"1db73295-0655-443c-91e0-2cd08b119141\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.192357 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-log-ovn\") pod \"1db73295-0655-443c-91e0-2cd08b119141\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.192442 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run-ovn\") pod \"1db73295-0655-443c-91e0-2cd08b119141\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.192526 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1db73295-0655-443c-91e0-2cd08b119141-scripts\") pod \"1db73295-0655-443c-91e0-2cd08b119141\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.192562 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-combined-ca-bundle\") pod \"1db73295-0655-443c-91e0-2cd08b119141\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.192584 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run\") pod \"1db73295-0655-443c-91e0-2cd08b119141\" (UID: \"1db73295-0655-443c-91e0-2cd08b119141\") " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.193306 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.193324 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.193349 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.193362 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b611e133-1d4a-49a8-9632-bdb825d41fa4-config\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.193376 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.193388 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9tjf\" (UniqueName: \"kubernetes.io/projected/b611e133-1d4a-49a8-9632-bdb825d41fa4-kube-api-access-k9tjf\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.193558 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1db73295-0655-443c-91e0-2cd08b119141" (UID: "1db73295-0655-443c-91e0-2cd08b119141"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.194865 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run" (OuterVolumeSpecName: "var-run") pod "1db73295-0655-443c-91e0-2cd08b119141" (UID: "1db73295-0655-443c-91e0-2cd08b119141"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.194919 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1db73295-0655-443c-91e0-2cd08b119141" (UID: "1db73295-0655-443c-91e0-2cd08b119141"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.196704 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1db73295-0655-443c-91e0-2cd08b119141-scripts" (OuterVolumeSpecName: "scripts") pod "1db73295-0655-443c-91e0-2cd08b119141" (UID: "1db73295-0655-443c-91e0-2cd08b119141"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.217717 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db73295-0655-443c-91e0-2cd08b119141-kube-api-access-rs7n8" (OuterVolumeSpecName: "kube-api-access-rs7n8") pod "1db73295-0655-443c-91e0-2cd08b119141" (UID: "1db73295-0655-443c-91e0-2cd08b119141"). InnerVolumeSpecName "kube-api-access-rs7n8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.298501 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvbqc\" (UniqueName: \"kubernetes.io/projected/fcb2a96e-6374-4a22-a7fd-058bfdefac42-kube-api-access-kvbqc\") pod \"fcb2a96e-6374-4a22-a7fd-058bfdefac42\" (UID: \"fcb2a96e-6374-4a22-a7fd-058bfdefac42\") "
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.299879 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs7n8\" (UniqueName: \"kubernetes.io/projected/1db73295-0655-443c-91e0-2cd08b119141-kube-api-access-rs7n8\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.300128 4763 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-log-ovn\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.300196 4763 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run-ovn\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.300248 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1db73295-0655-443c-91e0-2cd08b119141-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.300318 4763 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1db73295-0655-443c-91e0-2cd08b119141-var-run\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.335349 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb2a96e-6374-4a22-a7fd-058bfdefac42-kube-api-access-kvbqc" (OuterVolumeSpecName: "kube-api-access-kvbqc") pod "fcb2a96e-6374-4a22-a7fd-058bfdefac42" (UID: "fcb2a96e-6374-4a22-a7fd-058bfdefac42"). InnerVolumeSpecName "kube-api-access-kvbqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.344535 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b611e133-1d4a-49a8-9632-bdb825d41fa4/ovsdbserver-sb/0.log"
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.344929 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b611e133-1d4a-49a8-9632-bdb825d41fa4","Type":"ContainerDied","Data":"683e72551225fb42a0e8bddfa2b7dd515124d23141de7400e8111c20c09509bf"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.348735 4763 generic.go:334] "Generic (PLEG): container finished" podID="e2c5264e-b119-4444-b954-c33b428294b5" containerID="a4f61c64a8df3d9915add4b261e934b36d4aae625742c1aa68894904c7c207d4" exitCode=0 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.348765 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2c5264e-b119-4444-b954-c33b428294b5","Type":"ContainerDied","Data":"a4f61c64a8df3d9915add4b261e934b36d4aae625742c1aa68894904c7c207d4"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.348815 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2c5264e-b119-4444-b954-c33b428294b5","Type":"ContainerDied","Data":"026035cf8cc1caa067dae1549ea6b96a1ec29587d04378d9dc2ddf53c3015c3a"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.348828 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="026035cf8cc1caa067dae1549ea6b96a1ec29587d04378d9dc2ddf53c3015c3a" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.350721 4763 generic.go:334] "Generic (PLEG): container finished" podID="cf9f1fd7-72d5-4f11-b8c8-5e941597ca75" containerID="1b5be17d856f5ffa1ef3f73b9ba58a7f55f8b6924cca7141b43e2383ae2c3208" exitCode=0 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.350881 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9e56-account-delete-4pzvl" event={"ID":"cf9f1fd7-72d5-4f11-b8c8-5e941597ca75","Type":"ContainerDied","Data":"f4342fc5197bcad55bc24a8729b9bcabb4f803789b998265e7ff1c6f1a211663"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.350969 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4342fc5197bcad55bc24a8729b9bcabb4f803789b998265e7ff1c6f1a211663" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.351039 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9e56-account-delete-4pzvl" event={"ID":"cf9f1fd7-72d5-4f11-b8c8-5e941597ca75","Type":"ContainerDied","Data":"1b5be17d856f5ffa1ef3f73b9ba58a7f55f8b6924cca7141b43e2383ae2c3208"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.352570 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibbaf-account-delete-rr8rm" event={"ID":"0e45a139-0079-45cc-89a9-b1a0b0c1d179","Type":"ContainerStarted","Data":"253b2e156d518de184dbf86f2fdf0886c5ebe8d405e87877ee3f11ae8e20b2c8"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.355234 4763 generic.go:334] "Generic (PLEG): container finished" podID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerID="e8da034aa8e3585dad3aebd273d533766fe99e7f47d29b8c0da60ce7e190c340" exitCode=143 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.355457 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67bf5b69fb-ff2xw" event={"ID":"5ed0d19e-bbae-437d-9083-cded205c65f6","Type":"ContainerDied","Data":"e8da034aa8e3585dad3aebd273d533766fe99e7f47d29b8c0da60ce7e190c340"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.362885 4763 generic.go:334] "Generic (PLEG): container 
finished" podID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerID="81acae4ba8a1fe31f7b7ec84384f6c7903c26616e109d6de747404a115029a84" exitCode=0 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.362915 4763 generic.go:334] "Generic (PLEG): container finished" podID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerID="3e0c5a3566149a9091d7d20437254c474eb77aa49ad67fd687b671660064adfb" exitCode=0 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.362952 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" event={"ID":"a99f7915-f0b7-498a-941d-b02d87df4b98","Type":"ContainerDied","Data":"81acae4ba8a1fe31f7b7ec84384f6c7903c26616e109d6de747404a115029a84"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.362976 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" event={"ID":"a99f7915-f0b7-498a-941d-b02d87df4b98","Type":"ContainerDied","Data":"3e0c5a3566149a9091d7d20437254c474eb77aa49ad67fd687b671660064adfb"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.362984 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" event={"ID":"a99f7915-f0b7-498a-941d-b02d87df4b98","Type":"ContainerDied","Data":"e9053a0e77e480b226c00165f33e24c7498cbace6bd9982da58ebeb4a396e7bd"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.362993 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9053a0e77e480b226c00165f33e24c7498cbace6bd9982da58ebeb4a396e7bd" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.365901 4763 generic.go:334] "Generic (PLEG): container finished" podID="bc331486-cb31-4169-a564-51f8527ec8dd" containerID="c8f854a8e0e8f8b63357c20a3ee69e40c128f3f024eaa531bfc9fe89a8b73296" exitCode=143 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.365973 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f884f68c5-j4x5x" event={"ID":"bc331486-cb31-4169-a564-51f8527ec8dd","Type":"ContainerDied","Data":"c8f854a8e0e8f8b63357c20a3ee69e40c128f3f024eaa531bfc9fe89a8b73296"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.367744 4763 generic.go:334] "Generic (PLEG): container finished" podID="aea8c25c-f29f-49ba-ab27-87c8661479ab" containerID="6f12ce438cdfca7bff7ec6b8d59f8bef94cce949cecfbd974e69e68743678f6d" exitCode=143 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.367782 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" event={"ID":"aea8c25c-f29f-49ba-ab27-87c8661479ab","Type":"ContainerDied","Data":"6f12ce438cdfca7bff7ec6b8d59f8bef94cce949cecfbd974e69e68743678f6d"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.369232 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder97e0-account-delete-988h9" event={"ID":"ae274648-abe2-416e-a43d-edc836edc424","Type":"ContainerStarted","Data":"622d618778d05bbbdd744c9c116177e9d0287d466cbd347edf7f53b1fe93641e"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.370570 4763 generic.go:334] "Generic (PLEG): container finished" podID="7550cde2-d6ca-4dc1-8772-5eea0a9b8142" containerID="79023af8bc518b2c3e3c55cdf8786ec0640148a9740e74d8f02ab74827d3004d" exitCode=0 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.370629 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementb89d-account-delete-df5j7" 
event={"ID":"7550cde2-d6ca-4dc1-8772-5eea0a9b8142","Type":"ContainerDied","Data":"79023af8bc518b2c3e3c55cdf8786ec0640148a9740e74d8f02ab74827d3004d"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.374207 4763 generic.go:334] "Generic (PLEG): container finished" podID="783d0307-40e6-4d1e-9728-b1fe356e6b52" containerID="4f8c5e6c6bac428024dc97ceaef682e62b42b67d2f61d3a18743765dbbf6718d" exitCode=0 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.374393 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"783d0307-40e6-4d1e-9728-b1fe356e6b52","Type":"ContainerDied","Data":"4f8c5e6c6bac428024dc97ceaef682e62b42b67d2f61d3a18743765dbbf6718d"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.374579 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"783d0307-40e6-4d1e-9728-b1fe356e6b52","Type":"ContainerDied","Data":"f2eb8078d6b11977d27a5d4c8403e9ce60697f994c479ce5edef18d8ebe98011"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.374685 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2eb8078d6b11977d27a5d4c8403e9ce60697f994c479ce5edef18d8ebe98011" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.379273 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican41ee-account-delete-q7t27" event={"ID":"4a588d68-fc19-4242-9b61-0ed79678fc9e","Type":"ContainerStarted","Data":"5f8b5de9d649736ce71949b48e43c1af95542cb542439785a04e6c37d0a3441f"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.382037 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0903f-account-delete-2hlbl" event={"ID":"7b9bf16b-039c-46ba-ae41-f0622530202d","Type":"ContainerStarted","Data":"0f14aade8b4af8f3bd745e51714b8a0fae5fba3e4e8568965f4260f4ccc70e29"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.396484 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1db73295-0655-443c-91e0-2cd08b119141" (UID: "1db73295-0655-443c-91e0-2cd08b119141"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.402072 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvbqc\" (UniqueName: \"kubernetes.io/projected/fcb2a96e-6374-4a22-a7fd-058bfdefac42-kube-api-access-kvbqc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.402220 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.408942 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" event={"ID":"2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3","Type":"ContainerDied","Data":"99dd7c5c327df9d137d54643d744320fd2f62ce8f9b98f591caa3b4183e23a8a"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.409052 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cc449b9dc-ktclz" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.422720 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kwz5v" event={"ID":"1db73295-0655-443c-91e0-2cd08b119141","Type":"ContainerDied","Data":"9a3167b2cef40c711d40ef58feda0d96b9fafe2cc7ae9998185de46719f43773"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.422806 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kwz5v" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.431634 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutrona40b-account-delete-z82vw" event={"ID":"fcb2a96e-6374-4a22-a7fd-058bfdefac42","Type":"ContainerDied","Data":"cfd4f851da045307bf1d532bed1c20be921d4976f86033d85efe8344a18298b1"} Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.431728 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutrona40b-account-delete-z82vw" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.448960 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.493100 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "b611e133-1d4a-49a8-9632-bdb825d41fa4" (UID: "b611e133-1d4a-49a8-9632-bdb825d41fa4"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.509222 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.509253 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.519543 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b611e133-1d4a-49a8-9632-bdb825d41fa4" (UID: "b611e133-1d4a-49a8-9632-bdb825d41fa4"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.552270 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "1db73295-0655-443c-91e0-2cd08b119141" (UID: "1db73295-0655-443c-91e0-2cd08b119141"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.610813 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db73295-0655-443c-91e0-2cd08b119141-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.610843 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b611e133-1d4a-49a8-9632-bdb825d41fa4-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.788542 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.789470 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="ceilometer-central-agent" containerID="cri-o://4050fb5fd9697e750ea813cde28a4d185f69fe05aa260d270466a73fd43cd815" gracePeriod=30 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.790589 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="proxy-httpd" containerID="cri-o://853ee2a8bbade3ee4f3a22fb7a290fda5a987ec05e7c37f8d30f4d981573d587" gracePeriod=30 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.790669 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="sg-core" containerID="cri-o://e4e037d3269bde82dabfcdad3ed6cb4f0b5a8c6d24b989ac4a2b7ef124409e4b" gracePeriod=30 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.790700 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="ceilometer-notification-agent" containerID="cri-o://9f9834fea4bcdd3ebd6df5293d9dbc06c9286884eddc735ba525f57b0060ea83" gracePeriod=30 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.845704 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.845957 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0" containerName="kube-state-metrics" containerID="cri-o://a9c41d23775fc1eb5013bbc346ef1bd994014c9076f54a0e50116c9cc0474cc7" gracePeriod=30 Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.966524 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Sep 30 13:58:13 crc kubenswrapper[4763]: I0930 13:58:13.968578 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="e5f7940e-dedf-45a0-97b4-dc825dc00fc5" containerName="memcached" containerID="cri-o://8d9e3ab86dc859f16e88025097f97a98ba29d69374fe2446837e45205f560afd" gracePeriod=30 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.045562 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jpdl6"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.057642 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jpdl6"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.070173 
4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-7hcz2"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.079982 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-7hcz2"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.085464 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5775d899cd-b25ch"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.087981 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5775d899cd-b25ch" podUID="0cf247fc-bc61-4305-b8a5-19ac60eba62a" containerName="keystone-api" containerID="cri-o://ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4" gracePeriod=30 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.139928 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance9e56-account-delete-4pzvl" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.144204 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.170137 4763 scope.go:117] "RemoveContainer" containerID="ecb99a945c2d9b0bf36ec0c4004dd06419870d85ada8f6a288f0b829f450fa4d" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.170920 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.187656 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jpst6"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.188988 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.207112 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jpst6"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.227940 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.228567 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-fac2-account-create-757bn"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.228617 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-5b87bfdd4b-tbjxc" podUID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.147:8778/\": read tcp 10.217.0.2:43134->10.217.0.147:8778: read: connection reset by peer" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.228703 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-5b87bfdd4b-tbjxc" podUID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.147:8778/\": read tcp 10.217.0.2:43140->10.217.0.147:8778: read: connection reset by peer" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.244006 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qtqv\" (UniqueName: \"kubernetes.io/projected/cf9f1fd7-72d5-4f11-b8c8-5e941597ca75-kube-api-access-6qtqv\") pod \"cf9f1fd7-72d5-4f11-b8c8-5e941597ca75\" (UID: \"cf9f1fd7-72d5-4f11-b8c8-5e941597ca75\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.246039 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-fac2-account-create-757bn"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.263046 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-d5jt9"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.266644 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-d5jt9"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.271901 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9f1fd7-72d5-4f11-b8c8-5e941597ca75-kube-api-access-6qtqv" (OuterVolumeSpecName: "kube-api-access-6qtqv") pod "cf9f1fd7-72d5-4f11-b8c8-5e941597ca75" (UID: "cf9f1fd7-72d5-4f11-b8c8-5e941597ca75"). InnerVolumeSpecName "kube-api-access-6qtqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.288395 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementb89d-account-delete-df5j7"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.298632 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b89d-account-create-d82w4"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.306682 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b89d-account-create-d82w4"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.335791 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutrona40b-account-delete-z82vw"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.346389 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutrona40b-account-delete-z82vw"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347438 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-nova-novncproxy-tls-certs\") pod \"783d0307-40e6-4d1e-9728-b1fe356e6b52\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347508 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-combined-ca-bundle\") pod \"783d0307-40e6-4d1e-9728-b1fe356e6b52\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347539 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-internal-tls-certs\") pod \"a99f7915-f0b7-498a-941d-b02d87df4b98\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347560 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-etc-swift\") pod \"a99f7915-f0b7-498a-941d-b02d87df4b98\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347579 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-log-httpd\") pod \"a99f7915-f0b7-498a-941d-b02d87df4b98\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347626 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-secrets\") pod \"e2c5264e-b119-4444-b954-c33b428294b5\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347641 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-config-data\") pod \"a99f7915-f0b7-498a-941d-b02d87df4b98\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347661 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-combined-ca-bundle\") pod \"a99f7915-f0b7-498a-941d-b02d87df4b98\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347680 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2t89\" (UniqueName: \"kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-kube-api-access-d2t89\") pod \"a99f7915-f0b7-498a-941d-b02d87df4b98\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347712 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-vencrypt-tls-certs\") pod \"783d0307-40e6-4d1e-9728-b1fe356e6b52\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347757 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-operator-scripts\") pod \"e2c5264e-b119-4444-b954-c33b428294b5\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347794 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-public-tls-certs\") pod \"a99f7915-f0b7-498a-941d-b02d87df4b98\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347817 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-kolla-config\") pod \"e2c5264e-b119-4444-b954-c33b428294b5\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347846 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sktq\" (UniqueName: \"kubernetes.io/projected/783d0307-40e6-4d1e-9728-b1fe356e6b52-kube-api-access-4sktq\") pod \"783d0307-40e6-4d1e-9728-b1fe356e6b52\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347870 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j6rf\" (UniqueName: \"kubernetes.io/projected/e2c5264e-b119-4444-b954-c33b428294b5-kube-api-access-6j6rf\") pod \"e2c5264e-b119-4444-b954-c33b428294b5\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347898 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e2c5264e-b119-4444-b954-c33b428294b5-config-data-generated\") pod \"e2c5264e-b119-4444-b954-c33b428294b5\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347931 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e2c5264e-b119-4444-b954-c33b428294b5\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347953 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-combined-ca-bundle\") pod \"e2c5264e-b119-4444-b954-c33b428294b5\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347968 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-config-data-default\") pod \"e2c5264e-b119-4444-b954-c33b428294b5\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.347995 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-config-data\") pod \"783d0307-40e6-4d1e-9728-b1fe356e6b52\" (UID: \"783d0307-40e6-4d1e-9728-b1fe356e6b52\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.348016 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-run-httpd\") pod \"a99f7915-f0b7-498a-941d-b02d87df4b98\" (UID: \"a99f7915-f0b7-498a-941d-b02d87df4b98\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.348045 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-galera-tls-certs\") pod \"e2c5264e-b119-4444-b954-c33b428294b5\" (UID: \"e2c5264e-b119-4444-b954-c33b428294b5\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.348425 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qtqv\" (UniqueName: \"kubernetes.io/projected/cf9f1fd7-72d5-4f11-b8c8-5e941597ca75-kube-api-access-6qtqv\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.351162 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2c5264e-b119-4444-b954-c33b428294b5" (UID: "e2c5264e-b119-4444-b954-c33b428294b5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.353832 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2c5264e-b119-4444-b954-c33b428294b5-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e2c5264e-b119-4444-b954-c33b428294b5" (UID: "e2c5264e-b119-4444-b954-c33b428294b5"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.354263 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e2c5264e-b119-4444-b954-c33b428294b5" (UID: "e2c5264e-b119-4444-b954-c33b428294b5"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.358514 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a99f7915-f0b7-498a-941d-b02d87df4b98" (UID: "a99f7915-f0b7-498a-941d-b02d87df4b98"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.366964 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a99f7915-f0b7-498a-941d-b02d87df4b98" (UID: "a99f7915-f0b7-498a-941d-b02d87df4b98"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.367498 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e2c5264e-b119-4444-b954-c33b428294b5" (UID: "e2c5264e-b119-4444-b954-c33b428294b5"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.384829 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-ktclz"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.406819 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-ktclz"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.413708 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-kube-api-access-d2t89" (OuterVolumeSpecName: "kube-api-access-d2t89") pod "a99f7915-f0b7-498a-941d-b02d87df4b98" (UID: "a99f7915-f0b7-498a-941d-b02d87df4b98"). InnerVolumeSpecName "kube-api-access-d2t89". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.413827 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a99f7915-f0b7-498a-941d-b02d87df4b98" (UID: "a99f7915-f0b7-498a-941d-b02d87df4b98"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.413916 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-secrets" (OuterVolumeSpecName: "secrets") pod "e2c5264e-b119-4444-b954-c33b428294b5" (UID: "e2c5264e-b119-4444-b954-c33b428294b5"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.413998 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="b321cfd6-9039-4fe6-a39c-619f101d5e30" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.171:8776/healthcheck\": read tcp 10.217.0.2:49044->10.217.0.171:8776: read: connection reset by peer" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.417193 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.419209 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c5264e-b119-4444-b954-c33b428294b5-kube-api-access-6j6rf" (OuterVolumeSpecName: "kube-api-access-6j6rf") pod "e2c5264e-b119-4444-b954-c33b428294b5" (UID: "e2c5264e-b119-4444-b954-c33b428294b5"). InnerVolumeSpecName "kube-api-access-6j6rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.420940 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783d0307-40e6-4d1e-9728-b1fe356e6b52-kube-api-access-4sktq" (OuterVolumeSpecName: "kube-api-access-4sktq") pod "783d0307-40e6-4d1e-9728-b1fe356e6b52" (UID: "783d0307-40e6-4d1e-9728-b1fe356e6b52"). InnerVolumeSpecName "kube-api-access-4sktq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.450044 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.450819 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-config-data" (OuterVolumeSpecName: "config-data") pod "783d0307-40e6-4d1e-9728-b1fe356e6b52" (UID: "783d0307-40e6-4d1e-9728-b1fe356e6b52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.475623 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "e2c5264e-b119-4444-b954-c33b428294b5" (UID: "e2c5264e-b119-4444-b954-c33b428294b5"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.479137 4763 generic.go:334] "Generic (PLEG): container finished" podID="4a588d68-fc19-4242-9b61-0ed79678fc9e" containerID="9355d58d746614fcc1470f5ee2a3cbecebd32880006f604b38e2e1f09b1a4a21" exitCode=0 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.479238 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican41ee-account-delete-q7t27" event={"ID":"4a588d68-fc19-4242-9b61-0ed79678fc9e","Type":"ContainerDied","Data":"9355d58d746614fcc1470f5ee2a3cbecebd32880006f604b38e2e1f09b1a4a21"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.485532 4763 generic.go:334] "Generic (PLEG): container finished" podID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerID="6e00eb474337eb85a3ae6ce678a0a8afddc2bad42ef7bdbf41de0b427ce3b086" exitCode=0 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.485631 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b87bfdd4b-tbjxc" event={"ID":"15fbd312-35ac-4e62-ad60-ffccf94eab4a","Type":"ContainerDied","Data":"6e00eb474337eb85a3ae6ce678a0a8afddc2bad42ef7bdbf41de0b427ce3b086"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.489156 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kwz5v"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.494847 4763 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.494877 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.494891 4763 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.494904 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2t89\" (UniqueName: \"kubernetes.io/projected/a99f7915-f0b7-498a-941d-b02d87df4b98-kube-api-access-d2t89\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.494918 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.494930 4763 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-kolla-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.494941 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sktq\" (UniqueName: \"kubernetes.io/projected/783d0307-40e6-4d1e-9728-b1fe356e6b52-kube-api-access-4sktq\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.494954 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j6rf\" (UniqueName: \"kubernetes.io/projected/e2c5264e-b119-4444-b954-c33b428294b5-kube-api-access-6j6rf\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 
13:58:14.494965 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e2c5264e-b119-4444-b954-c33b428294b5-config-data-generated\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.494989 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.495000 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e2c5264e-b119-4444-b954-c33b428294b5-config-data-default\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.495009 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.495019 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a99f7915-f0b7-498a-941d-b02d87df4b98-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.502847 4763 generic.go:334] "Generic (PLEG): container finished" podID="bc331486-cb31-4169-a564-51f8527ec8dd" containerID="42b30ec43f1257d28794be7be6660214b1f78e8dcc9ff724d26c8c28a27d8b51" exitCode=0 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.511757 4763 generic.go:334] "Generic (PLEG): container finished" podID="0e45a139-0079-45cc-89a9-b1a0b0c1d179" containerID="bd91d5f1697ca42a565bdfcc3ac0d792de9c64aaa05b55fe8b3688f3e91d36cd" exitCode=0 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.520177 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" path="/var/lib/kubelet/pods/2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3/volumes" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.520996 4763 generic.go:334] "Generic (PLEG): container finished" podID="8bb91013-85e0-4a13-9a06-0608b16a147b" containerID="cba654c3201589fdfeff007899f0f73ab1e63e34fc4ccb4f54ba1464e5755d0a" exitCode=0 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.523878 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="573ae1be-3060-4fea-a7fe-b7feeaa60cc7" path="/var/lib/kubelet/pods/573ae1be-3060-4fea-a7fe-b7feeaa60cc7/volumes" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.526085 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5af7ed04-4a1d-4283-b6c3-e9afb4ee4675" path="/var/lib/kubelet/pods/5af7ed04-4a1d-4283-b6c3-e9afb4ee4675/volumes" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.531015 4763 generic.go:334] "Generic (PLEG): container finished" podID="ae274648-abe2-416e-a43d-edc836edc424" containerID="001400b738e180e3a088717caa1a086bcf8cc3e72a547b2cbd98bcaa871db31d" exitCode=0 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.531127 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3eeb39-e6e2-4824-8022-fd652d13ed03" path="/var/lib/kubelet/pods/6a3eeb39-e6e2-4824-8022-fd652d13ed03/volumes" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.537425 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d37da4a-5377-4d05-93c5-04f933f77894" 
path="/var/lib/kubelet/pods/9d37da4a-5377-4d05-93c5-04f933f77894/volumes" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.538242 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b611e133-1d4a-49a8-9632-bdb825d41fa4" path="/var/lib/kubelet/pods/b611e133-1d4a-49a8-9632-bdb825d41fa4/volumes" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.540872 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db237d2f-d736-42bd-bd84-dc9d93909367" path="/var/lib/kubelet/pods/db237d2f-d736-42bd-bd84-dc9d93909367/volumes" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.542970 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6826233-27ed-40bc-9d7b-92312272f1e5" path="/var/lib/kubelet/pods/f6826233-27ed-40bc-9d7b-92312272f1e5/volumes" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.544045 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb2a96e-6374-4a22-a7fd-058bfdefac42" path="/var/lib/kubelet/pods/fcb2a96e-6374-4a22-a7fd-058bfdefac42/volumes" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.556794 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba159e27-7a3b-4b90-a7db-de6135f8153c" containerID="cb40ecb42c9c7e13873d46e9437c88735cec11180b4edf02393f88c403b8189b" exitCode=0 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.569856 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a99f7915-f0b7-498a-941d-b02d87df4b98" (UID: "a99f7915-f0b7-498a-941d-b02d87df4b98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.592505 4763 generic.go:334] "Generic (PLEG): container finished" podID="7b9bf16b-039c-46ba-ae41-f0622530202d" containerID="f04f15127c26fdd23725e119940adc91f2443e20537485e0fcc64c95fdba4519" exitCode=0 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.596731 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "783d0307-40e6-4d1e-9728-b1fe356e6b52" (UID: "783d0307-40e6-4d1e-9728-b1fe356e6b52"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.597165 4763 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.597193 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.629807 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.635808 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-config-data" (OuterVolumeSpecName: "config-data") pod "a99f7915-f0b7-498a-941d-b02d87df4b98" (UID: "a99f7915-f0b7-498a-941d-b02d87df4b98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.649443 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "783d0307-40e6-4d1e-9728-b1fe356e6b52" (UID: "783d0307-40e6-4d1e-9728-b1fe356e6b52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.650469 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a99f7915-f0b7-498a-941d-b02d87df4b98" (UID: "a99f7915-f0b7-498a-941d-b02d87df4b98"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.650827 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="e5f7940e-dedf-45a0-97b4-dc825dc00fc5" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.106:11211: connect: connection refused" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.671394 4763 generic.go:334] "Generic (PLEG): container finished" podID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerID="853ee2a8bbade3ee4f3a22fb7a290fda5a987ec05e7c37f8d30f4d981573d587" exitCode=0 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.671468 4763 generic.go:334] "Generic (PLEG): container finished" podID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerID="e4e037d3269bde82dabfcdad3ed6cb4f0b5a8c6d24b989ac4a2b7ef124409e4b" exitCode=2 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.690125 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "e2c5264e-b119-4444-b954-c33b428294b5" (UID: "e2c5264e-b119-4444-b954-c33b428294b5"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.692110 4763 generic.go:334] "Generic (PLEG): container finished" podID="aea8c25c-f29f-49ba-ab27-87c8661479ab" containerID="8a1c727d333559a452f33984696e78504154274594d0d689186dfd04e4589f8b" exitCode=0 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.699699 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.699740 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.699754 4763 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.699763 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.699777 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.706053 4763 generic.go:334] "Generic (PLEG): container finished" podID="a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0" containerID="a9c41d23775fc1eb5013bbc346ef1bd994014c9076f54a0e50116c9cc0474cc7" exitCode=2 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.706148 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance9e56-account-delete-4pzvl" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.709572 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.709922 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.710364 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.725150 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a99f7915-f0b7-498a-941d-b02d87df4b98" (UID: "a99f7915-f0b7-498a-941d-b02d87df4b98"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.759295 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2c5264e-b119-4444-b954-c33b428294b5" (UID: "e2c5264e-b119-4444-b954-c33b428294b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.772793 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="2b7af94e-accb-45ca-af30-c489c8d77b12" containerName="galera" containerID="cri-o://5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97" gracePeriod=30 Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.786644 4763 scope.go:117] "RemoveContainer" containerID="f076032ba256059553984a2d073b2dcc74aadf98fe54ecddda41aaee3f716c6e" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.789046 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "783d0307-40e6-4d1e-9728-b1fe356e6b52" (UID: "783d0307-40e6-4d1e-9728-b1fe356e6b52"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.801745 4763 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d0307-40e6-4d1e-9728-b1fe356e6b52-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.801786 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c5264e-b119-4444-b954-c33b428294b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.801796 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a99f7915-f0b7-498a-941d-b02d87df4b98-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858307 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kwz5v"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858352 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f884f68c5-j4x5x" event={"ID":"bc331486-cb31-4169-a564-51f8527ec8dd","Type":"ContainerDied","Data":"42b30ec43f1257d28794be7be6660214b1f78e8dcc9ff724d26c8c28a27d8b51"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858404 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f884f68c5-j4x5x" event={"ID":"bc331486-cb31-4169-a564-51f8527ec8dd","Type":"ContainerDied","Data":"7e847ac3e8459d07c05fadee37d60c5c57e7e16c7493a39a9aba11429525807c"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858491 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e847ac3e8459d07c05fadee37d60c5c57e7e16c7493a39a9aba11429525807c" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858580 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibbaf-account-delete-rr8rm" event={"ID":"0e45a139-0079-45cc-89a9-b1a0b0c1d179","Type":"ContainerDied","Data":"bd91d5f1697ca42a565bdfcc3ac0d792de9c64aaa05b55fe8b3688f3e91d36cd"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858628 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bb91013-85e0-4a13-9a06-0608b16a147b","Type":"ContainerDied","Data":"cba654c3201589fdfeff007899f0f73ab1e63e34fc4ccb4f54ba1464e5755d0a"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858644 4763 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder97e0-account-delete-988h9" event={"ID":"ae274648-abe2-416e-a43d-edc836edc424","Type":"ContainerDied","Data":"001400b738e180e3a088717caa1a086bcf8cc3e72a547b2cbd98bcaa871db31d"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858657 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba159e27-7a3b-4b90-a7db-de6135f8153c","Type":"ContainerDied","Data":"cb40ecb42c9c7e13873d46e9437c88735cec11180b4edf02393f88c403b8189b"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858676 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0903f-account-delete-2hlbl" event={"ID":"7b9bf16b-039c-46ba-ae41-f0622530202d","Type":"ContainerDied","Data":"f04f15127c26fdd23725e119940adc91f2443e20537485e0fcc64c95fdba4519"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858728 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dafb3edf-a4c0-4131-ad09-f836de63ff6b","Type":"ContainerDied","Data":"853ee2a8bbade3ee4f3a22fb7a290fda5a987ec05e7c37f8d30f4d981573d587"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858744 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dafb3edf-a4c0-4131-ad09-f836de63ff6b","Type":"ContainerDied","Data":"e4e037d3269bde82dabfcdad3ed6cb4f0b5a8c6d24b989ac4a2b7ef124409e4b"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858753 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" event={"ID":"aea8c25c-f29f-49ba-ab27-87c8661479ab","Type":"ContainerDied","Data":"8a1c727d333559a452f33984696e78504154274594d0d689186dfd04e4589f8b"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858765 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" event={"ID":"aea8c25c-f29f-49ba-ab27-87c8661479ab","Type":"ContainerDied","Data":"a4d7497733080914437aa44414c72e1bd14ae53940dcea5e77876877cb23fa76"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858773 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4d7497733080914437aa44414c72e1bd14ae53940dcea5e77876877cb23fa76" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.858792 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0","Type":"ContainerDied","Data":"a9c41d23775fc1eb5013bbc346ef1bd994014c9076f54a0e50116c9cc0474cc7"} Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.868276 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.903122 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ll6q\" (UniqueName: \"kubernetes.io/projected/bc331486-cb31-4169-a564-51f8527ec8dd-kube-api-access-8ll6q\") pod \"bc331486-cb31-4169-a564-51f8527ec8dd\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.903184 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data\") pod \"bc331486-cb31-4169-a564-51f8527ec8dd\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.903242 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-combined-ca-bundle\") pod \"bc331486-cb31-4169-a564-51f8527ec8dd\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.903309 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data-custom\") pod \"bc331486-cb31-4169-a564-51f8527ec8dd\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.903338 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc331486-cb31-4169-a564-51f8527ec8dd-logs\") pod \"bc331486-cb31-4169-a564-51f8527ec8dd\" (UID: \"bc331486-cb31-4169-a564-51f8527ec8dd\") " Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.904167 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc331486-cb31-4169-a564-51f8527ec8dd-logs" (OuterVolumeSpecName: "logs") pod "bc331486-cb31-4169-a564-51f8527ec8dd" (UID: "bc331486-cb31-4169-a564-51f8527ec8dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.907339 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc331486-cb31-4169-a564-51f8527ec8dd-kube-api-access-8ll6q" (OuterVolumeSpecName: "kube-api-access-8ll6q") pod "bc331486-cb31-4169-a564-51f8527ec8dd" (UID: "bc331486-cb31-4169-a564-51f8527ec8dd"). InnerVolumeSpecName "kube-api-access-8ll6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.931154 4763 scope.go:117] "RemoveContainer" containerID="f18e050afde37900d0b00f1f42394f96b83e1b630126fc3ff1f6312b776bc3ae" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.941280 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bc331486-cb31-4169-a564-51f8527ec8dd" (UID: "bc331486-cb31-4169-a564-51f8527ec8dd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.945097 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.945717 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.952321 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.969748 4763 scope.go:117] "RemoveContainer" containerID="2589b94c1779aa45bfe6bc84dd80bb5efa5a6559e17433f8f4c0ba2be4f7b26d" Sep 30 13:58:14 crc kubenswrapper[4763]: E0930 13:58:14.971791 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:14 crc kubenswrapper[4763]: E0930 13:58:14.975184 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.990854 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc331486-cb31-4169-a564-51f8527ec8dd" (UID: "bc331486-cb31-4169-a564-51f8527ec8dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:14 crc kubenswrapper[4763]: E0930 13:58:14.991030 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:14 crc kubenswrapper[4763]: E0930 13:58:14.991131 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server" Sep 30 13:58:14 crc kubenswrapper[4763]: I0930 13:58:14.998361 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data" (OuterVolumeSpecName: "config-data") pod "bc331486-cb31-4169-a564-51f8527ec8dd" (UID: "bc331486-cb31-4169-a564-51f8527ec8dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.006923 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-certs\") pod \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007076 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data-custom\") pod \"aea8c25c-f29f-49ba-ab27-87c8661479ab\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007119 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data-custom\") pod \"8bb91013-85e0-4a13-9a06-0608b16a147b\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007156 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea8c25c-f29f-49ba-ab27-87c8661479ab-logs\") pod \"aea8c25c-f29f-49ba-ab27-87c8661479ab\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007186 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-combined-ca-bundle\") pod \"aea8c25c-f29f-49ba-ab27-87c8661479ab\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007266 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data\") pod \"aea8c25c-f29f-49ba-ab27-87c8661479ab\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007296 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-config\") pod \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007389 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data\") pod \"8bb91013-85e0-4a13-9a06-0608b16a147b\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007413 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9th2\" (UniqueName: \"kubernetes.io/projected/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-api-access-m9th2\") pod \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007496 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7gtt\" (UniqueName: \"kubernetes.io/projected/8bb91013-85e0-4a13-9a06-0608b16a147b-kube-api-access-z7gtt\") 
pod \"8bb91013-85e0-4a13-9a06-0608b16a147b\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007553 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bb91013-85e0-4a13-9a06-0608b16a147b-etc-machine-id\") pod \"8bb91013-85e0-4a13-9a06-0608b16a147b\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007579 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-combined-ca-bundle\") pod \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007634 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-scripts\") pod \"8bb91013-85e0-4a13-9a06-0608b16a147b\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007668 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-combined-ca-bundle\") pod \"8bb91013-85e0-4a13-9a06-0608b16a147b\" (UID: \"8bb91013-85e0-4a13-9a06-0608b16a147b\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.007710 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7dfv\" (UniqueName: \"kubernetes.io/projected/aea8c25c-f29f-49ba-ab27-87c8661479ab-kube-api-access-t7dfv\") pod \"aea8c25c-f29f-49ba-ab27-87c8661479ab\" (UID: \"aea8c25c-f29f-49ba-ab27-87c8661479ab\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.008715 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ll6q\" (UniqueName: \"kubernetes.io/projected/bc331486-cb31-4169-a564-51f8527ec8dd-kube-api-access-8ll6q\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.008750 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.008763 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.008773 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc331486-cb31-4169-a564-51f8527ec8dd-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.008784 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc331486-cb31-4169-a564-51f8527ec8dd-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.009694 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aea8c25c-f29f-49ba-ab27-87c8661479ab-logs" (OuterVolumeSpecName: "logs") pod "aea8c25c-f29f-49ba-ab27-87c8661479ab" (UID: "aea8c25c-f29f-49ba-ab27-87c8661479ab"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.010265 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb91013-85e0-4a13-9a06-0608b16a147b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8bb91013-85e0-4a13-9a06-0608b16a147b" (UID: "8bb91013-85e0-4a13-9a06-0608b16a147b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.014501 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea8c25c-f29f-49ba-ab27-87c8661479ab-kube-api-access-t7dfv" (OuterVolumeSpecName: "kube-api-access-t7dfv") pod "aea8c25c-f29f-49ba-ab27-87c8661479ab" (UID: "aea8c25c-f29f-49ba-ab27-87c8661479ab"). InnerVolumeSpecName "kube-api-access-t7dfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.014917 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb91013-85e0-4a13-9a06-0608b16a147b-kube-api-access-z7gtt" (OuterVolumeSpecName: "kube-api-access-z7gtt") pod "8bb91013-85e0-4a13-9a06-0608b16a147b" (UID: "8bb91013-85e0-4a13-9a06-0608b16a147b"). InnerVolumeSpecName "kube-api-access-z7gtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.015828 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.016087 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aea8c25c-f29f-49ba-ab27-87c8661479ab" (UID: "aea8c25c-f29f-49ba-ab27-87c8661479ab"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: E0930 13:58:15.018786 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.022182 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8bb91013-85e0-4a13-9a06-0608b16a147b" (UID: "8bb91013-85e0-4a13-9a06-0608b16a147b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: E0930 13:58:15.024009 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:15 crc kubenswrapper[4763]: E0930 13:58:15.034736 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:15 crc kubenswrapper[4763]: E0930 13:58:15.034811 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovs-vswitchd" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.035208 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-api-access-m9th2" (OuterVolumeSpecName: "kube-api-access-m9th2") pod "a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0" (UID: "a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0"). InnerVolumeSpecName "kube-api-access-m9th2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.056709 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-scripts" (OuterVolumeSpecName: "scripts") pod "8bb91013-85e0-4a13-9a06-0608b16a147b" (UID: "8bb91013-85e0-4a13-9a06-0608b16a147b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.064818 4763 scope.go:117] "RemoveContainer" containerID="6929f8af8d3cf797dc4b407e18a7a6d4c22dc654105d94f4fa1d84446a16b519" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.091401 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-9b5dc4bf7-vwl5v"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.106594 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-9b5dc4bf7-vwl5v"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.109927 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-internal-tls-certs\") pod \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110081 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-combined-ca-bundle\") pod \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110150 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fbd312-35ac-4e62-ad60-ffccf94eab4a-logs\") pod \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110172 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-public-tls-certs\") pod \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110192 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-scripts\") pod \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110390 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4crl\" (UniqueName: \"kubernetes.io/projected/15fbd312-35ac-4e62-ad60-ffccf94eab4a-kube-api-access-g4crl\") pod \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110452 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-config-data\") pod \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\" (UID: \"15fbd312-35ac-4e62-ad60-ffccf94eab4a\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110903 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7gtt\" (UniqueName: \"kubernetes.io/projected/8bb91013-85e0-4a13-9a06-0608b16a147b-kube-api-access-z7gtt\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110922 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bb91013-85e0-4a13-9a06-0608b16a147b-etc-machine-id\") on node \"crc\" 
DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110933 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110959 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7dfv\" (UniqueName: \"kubernetes.io/projected/aea8c25c-f29f-49ba-ab27-87c8661479ab-kube-api-access-t7dfv\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110972 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110982 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.110992 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea8c25c-f29f-49ba-ab27-87c8661479ab-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.111004 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9th2\" (UniqueName: \"kubernetes.io/projected/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-api-access-m9th2\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.113285 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-67bf5b69fb-ff2xw" podUID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:35026->10.217.0.158:9311: read: connection reset by peer" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.113451 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-67bf5b69fb-ff2xw" podUID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:35024->10.217.0.158:9311: read: connection reset by peer" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.115686 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fbd312-35ac-4e62-ad60-ffccf94eab4a-logs" (OuterVolumeSpecName: "logs") pod "15fbd312-35ac-4e62-ad60-ffccf94eab4a" (UID: "15fbd312-35ac-4e62-ad60-ffccf94eab4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.137215 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.146815 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-scripts" (OuterVolumeSpecName: "scripts") pod "15fbd312-35ac-4e62-ad60-ffccf94eab4a" (UID: "15fbd312-35ac-4e62-ad60-ffccf94eab4a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.146902 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fbd312-35ac-4e62-ad60-ffccf94eab4a-kube-api-access-g4crl" (OuterVolumeSpecName: "kube-api-access-g4crl") pod "15fbd312-35ac-4e62-ad60-ffccf94eab4a" (UID: "15fbd312-35ac-4e62-ad60-ffccf94eab4a"). InnerVolumeSpecName "kube-api-access-g4crl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.146920 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0" (UID: "a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: E0930 13:58:15.147019 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-certs podName:a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:15.64696852 +0000 UTC m=+1367.785528865 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "kube-state-metrics-tls-certs" (UniqueName: "kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-certs") pod "a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0" (UID: "a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0") : error deleting /var/lib/kubelet/pods/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0/volume-subpaths: remove /var/lib/kubelet/pods/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0/volume-subpaths: no such file or directory Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.150723 4763 scope.go:117] "RemoveContainer" containerID="d9fa2c70ae01b905e89f072043e631965785e292d0fdd3d7833783fc483c2de5" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.156101 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.174485 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0" (UID: "a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.176031 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.191512 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bb91013-85e0-4a13-9a06-0608b16a147b" (UID: "8bb91013-85e0-4a13-9a06-0608b16a147b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.199881 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.203097 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aea8c25c-f29f-49ba-ab27-87c8661479ab" (UID: "aea8c25c-f29f-49ba-ab27-87c8661479ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.224940 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.224976 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.224988 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.224999 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fbd312-35ac-4e62-ad60-ffccf94eab4a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.225015 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.225025 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4crl\" (UniqueName: \"kubernetes.io/projected/15fbd312-35ac-4e62-ad60-ffccf94eab4a-kube-api-access-g4crl\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.225037 4763 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.233198 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data" (OuterVolumeSpecName: "config-data") pod "aea8c25c-f29f-49ba-ab27-87c8661479ab" (UID: "aea8c25c-f29f-49ba-ab27-87c8661479ab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: E0930 13:58:15.244893 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 30 13:58:15 crc kubenswrapper[4763]: E0930 13:58:15.277315 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.285475 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15fbd312-35ac-4e62-ad60-ffccf94eab4a" (UID: "15fbd312-35ac-4e62-ad60-ffccf94eab4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: E0930 13:58:15.288588 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 30 13:58:15 crc kubenswrapper[4763]: E0930 13:58:15.288857 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="916727b2-6488-4edf-b33b-c5908eae0e41" containerName="ovn-northd" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.326862 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.326891 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea8c25c-f29f-49ba-ab27-87c8661479ab-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.341102 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-config-data" (OuterVolumeSpecName: "config-data") pod "15fbd312-35ac-4e62-ad60-ffccf94eab4a" (UID: "15fbd312-35ac-4e62-ad60-ffccf94eab4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.430528 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.435862 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data" (OuterVolumeSpecName: "config-data") pod "8bb91013-85e0-4a13-9a06-0608b16a147b" (UID: "8bb91013-85e0-4a13-9a06-0608b16a147b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.448118 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "15fbd312-35ac-4e62-ad60-ffccf94eab4a" (UID: "15fbd312-35ac-4e62-ad60-ffccf94eab4a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.459181 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "15fbd312-35ac-4e62-ad60-ffccf94eab4a" (UID: "15fbd312-35ac-4e62-ad60-ffccf94eab4a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.533295 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb91013-85e0-4a13-9a06-0608b16a147b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.533651 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.533699 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fbd312-35ac-4e62-ad60-ffccf94eab4a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.550679 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican41ee-account-delete-q7t27" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.566586 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder97e0-account-delete-988h9" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.569742 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibbaf-account-delete-rr8rm" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.577026 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.598948 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.608648 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0903f-account-delete-2hlbl" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.633698 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementb89d-account-delete-df5j7" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634372 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-combined-ca-bundle\") pod \"ce55d11a-887c-46e6-af05-90c3fca01e75\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634412 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-combined-ca-bundle\") pod \"ba159e27-7a3b-4b90-a7db-de6135f8153c\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634462 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-httpd-run\") pod \"ce55d11a-887c-46e6-af05-90c3fca01e75\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634488 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xshrq\" (UniqueName: \"kubernetes.io/projected/ba159e27-7a3b-4b90-a7db-de6135f8153c-kube-api-access-xshrq\") pod \"ba159e27-7a3b-4b90-a7db-de6135f8153c\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634521 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-config-data\") pod \"ce55d11a-887c-46e6-af05-90c3fca01e75\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634620 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-httpd-run\") pod \"ba159e27-7a3b-4b90-a7db-de6135f8153c\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634665 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-public-tls-certs\") pod \"ba159e27-7a3b-4b90-a7db-de6135f8153c\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634715 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-internal-tls-certs\") pod \"ce55d11a-887c-46e6-af05-90c3fca01e75\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634759 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmjcp\" (UniqueName: \"kubernetes.io/projected/4a588d68-fc19-4242-9b61-0ed79678fc9e-kube-api-access-bmjcp\") pod \"4a588d68-fc19-4242-9b61-0ed79678fc9e\" (UID: \"4a588d68-fc19-4242-9b61-0ed79678fc9e\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634790 4763 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hbhc\" (UniqueName: \"kubernetes.io/projected/ae274648-abe2-416e-a43d-edc836edc424-kube-api-access-7hbhc\") pod \"ae274648-abe2-416e-a43d-edc836edc424\" (UID: \"ae274648-abe2-416e-a43d-edc836edc424\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634832 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-logs\") pod \"ce55d11a-887c-46e6-af05-90c3fca01e75\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634857 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ce55d11a-887c-46e6-af05-90c3fca01e75\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634923 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-scripts\") pod \"ce55d11a-887c-46e6-af05-90c3fca01e75\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634951 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhq6h\" (UniqueName: \"kubernetes.io/projected/7b9bf16b-039c-46ba-ae41-f0622530202d-kube-api-access-lhq6h\") pod \"7b9bf16b-039c-46ba-ae41-f0622530202d\" (UID: \"7b9bf16b-039c-46ba-ae41-f0622530202d\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634978 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzb6c\" (UniqueName: \"kubernetes.io/projected/0e45a139-0079-45cc-89a9-b1a0b0c1d179-kube-api-access-pzb6c\") pod \"0e45a139-0079-45cc-89a9-b1a0b0c1d179\" (UID: \"0e45a139-0079-45cc-89a9-b1a0b0c1d179\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.634998 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ba159e27-7a3b-4b90-a7db-de6135f8153c\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.635022 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-config-data\") pod \"ba159e27-7a3b-4b90-a7db-de6135f8153c\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.635048 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxsth\" (UniqueName: \"kubernetes.io/projected/ce55d11a-887c-46e6-af05-90c3fca01e75-kube-api-access-sxsth\") pod \"ce55d11a-887c-46e6-af05-90c3fca01e75\" (UID: \"ce55d11a-887c-46e6-af05-90c3fca01e75\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.635067 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-logs\") pod \"ba159e27-7a3b-4b90-a7db-de6135f8153c\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.635094 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-scripts\") pod \"ba159e27-7a3b-4b90-a7db-de6135f8153c\" (UID: \"ba159e27-7a3b-4b90-a7db-de6135f8153c\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.637039 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ba159e27-7a3b-4b90-a7db-de6135f8153c" (UID: "ba159e27-7a3b-4b90-a7db-de6135f8153c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.639059 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ce55d11a-887c-46e6-af05-90c3fca01e75" (UID: "ce55d11a-887c-46e6-af05-90c3fca01e75"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.642869 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-logs" (OuterVolumeSpecName: "logs") pod "ce55d11a-887c-46e6-af05-90c3fca01e75" (UID: "ce55d11a-887c-46e6-af05-90c3fca01e75"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.644390 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a588d68-fc19-4242-9b61-0ed79678fc9e-kube-api-access-bmjcp" (OuterVolumeSpecName: "kube-api-access-bmjcp") pod "4a588d68-fc19-4242-9b61-0ed79678fc9e" (UID: "4a588d68-fc19-4242-9b61-0ed79678fc9e"). InnerVolumeSpecName "kube-api-access-bmjcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.644927 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-scripts" (OuterVolumeSpecName: "scripts") pod "ba159e27-7a3b-4b90-a7db-de6135f8153c" (UID: "ba159e27-7a3b-4b90-a7db-de6135f8153c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.645572 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-logs" (OuterVolumeSpecName: "logs") pod "ba159e27-7a3b-4b90-a7db-de6135f8153c" (UID: "ba159e27-7a3b-4b90-a7db-de6135f8153c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.648326 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.650785 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9bf16b-039c-46ba-ae41-f0622530202d-kube-api-access-lhq6h" (OuterVolumeSpecName: "kube-api-access-lhq6h") pod "7b9bf16b-039c-46ba-ae41-f0622530202d" (UID: "7b9bf16b-039c-46ba-ae41-f0622530202d"). InnerVolumeSpecName "kube-api-access-lhq6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.650913 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce55d11a-887c-46e6-af05-90c3fca01e75-kube-api-access-sxsth" (OuterVolumeSpecName: "kube-api-access-sxsth") pod "ce55d11a-887c-46e6-af05-90c3fca01e75" (UID: "ce55d11a-887c-46e6-af05-90c3fca01e75"). InnerVolumeSpecName "kube-api-access-sxsth". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.651799 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae274648-abe2-416e-a43d-edc836edc424-kube-api-access-7hbhc" (OuterVolumeSpecName: "kube-api-access-7hbhc") pod "ae274648-abe2-416e-a43d-edc836edc424" (UID: "ae274648-abe2-416e-a43d-edc836edc424"). InnerVolumeSpecName "kube-api-access-7hbhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.656329 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba159e27-7a3b-4b90-a7db-de6135f8153c-kube-api-access-xshrq" (OuterVolumeSpecName: "kube-api-access-xshrq") pod "ba159e27-7a3b-4b90-a7db-de6135f8153c" (UID: "ba159e27-7a3b-4b90-a7db-de6135f8153c"). InnerVolumeSpecName "kube-api-access-xshrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.666062 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "ba159e27-7a3b-4b90-a7db-de6135f8153c" (UID: "ba159e27-7a3b-4b90-a7db-de6135f8153c"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.666205 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-scripts" (OuterVolumeSpecName: "scripts") pod "ce55d11a-887c-46e6-af05-90c3fca01e75" (UID: "ce55d11a-887c-46e6-af05-90c3fca01e75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.666222 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e45a139-0079-45cc-89a9-b1a0b0c1d179-kube-api-access-pzb6c" (OuterVolumeSpecName: "kube-api-access-pzb6c") pod "0e45a139-0079-45cc-89a9-b1a0b0c1d179" (UID: "0e45a139-0079-45cc-89a9-b1a0b0c1d179"). InnerVolumeSpecName "kube-api-access-pzb6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.705884 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "ce55d11a-887c-46e6-af05-90c3fca01e75" (UID: "ce55d11a-887c-46e6-af05-90c3fca01e75"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.708776 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba159e27-7a3b-4b90-a7db-de6135f8153c" (UID: "ba159e27-7a3b-4b90-a7db-de6135f8153c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.718273 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce55d11a-887c-46e6-af05-90c3fca01e75" (UID: "ce55d11a-887c-46e6-af05-90c3fca01e75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.719254 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-config-data" (OuterVolumeSpecName: "config-data") pod "ce55d11a-887c-46e6-af05-90c3fca01e75" (UID: "ce55d11a-887c-46e6-af05-90c3fca01e75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.725269 4763 generic.go:334] "Generic (PLEG): container finished" podID="ce55d11a-887c-46e6-af05-90c3fca01e75" containerID="26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a" exitCode=0 Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.725443 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce55d11a-887c-46e6-af05-90c3fca01e75","Type":"ContainerDied","Data":"26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.725533 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce55d11a-887c-46e6-af05-90c3fca01e75","Type":"ContainerDied","Data":"72fe39f005d969bb9005ec3d228a785f291680c2e081a734c563831b7f077f26"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.725612 4763 scope.go:117] "RemoveContainer" containerID="26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.725766 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.731586 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0","Type":"ContainerDied","Data":"a1e33831828add2c6d80f40a159bc893182b58884a01730e224a39208c1ddd8b"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.731794 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.736114 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data-custom\") pod \"b321cfd6-9039-4fe6-a39c-619f101d5e30\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.736404 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-internal-tls-certs\") pod \"b321cfd6-9039-4fe6-a39c-619f101d5e30\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.736447 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b321cfd6-9039-4fe6-a39c-619f101d5e30-logs\") pod \"b321cfd6-9039-4fe6-a39c-619f101d5e30\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.736512 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data\") pod \"b321cfd6-9039-4fe6-a39c-619f101d5e30\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.736546 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4pwj\" (UniqueName: \"kubernetes.io/projected/7550cde2-d6ca-4dc1-8772-5eea0a9b8142-kube-api-access-v4pwj\") pod \"7550cde2-d6ca-4dc1-8772-5eea0a9b8142\" (UID: \"7550cde2-d6ca-4dc1-8772-5eea0a9b8142\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.736836 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-combined-ca-bundle\") pod \"b321cfd6-9039-4fe6-a39c-619f101d5e30\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.736976 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-scripts\") pod \"b321cfd6-9039-4fe6-a39c-619f101d5e30\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737131 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-certs\") pod \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\" (UID: \"a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737164 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b321cfd6-9039-4fe6-a39c-619f101d5e30-etc-machine-id\") pod \"b321cfd6-9039-4fe6-a39c-619f101d5e30\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737206 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-public-tls-certs\") pod 
\"b321cfd6-9039-4fe6-a39c-619f101d5e30\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737240 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj7kl\" (UniqueName: \"kubernetes.io/projected/b321cfd6-9039-4fe6-a39c-619f101d5e30-kube-api-access-dj7kl\") pod \"b321cfd6-9039-4fe6-a39c-619f101d5e30\" (UID: \"b321cfd6-9039-4fe6-a39c-619f101d5e30\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737805 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737905 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737921 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737933 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737947 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xshrq\" (UniqueName: \"kubernetes.io/projected/ba159e27-7a3b-4b90-a7db-de6135f8153c-kube-api-access-xshrq\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737959 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737970 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737983 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmjcp\" (UniqueName: \"kubernetes.io/projected/4a588d68-fc19-4242-9b61-0ed79678fc9e-kube-api-access-bmjcp\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.737996 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hbhc\" (UniqueName: \"kubernetes.io/projected/ae274648-abe2-416e-a43d-edc836edc424-kube-api-access-7hbhc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.738009 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce55d11a-887c-46e6-af05-90c3fca01e75-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.738031 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.738064 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.738131 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhq6h\" (UniqueName: \"kubernetes.io/projected/7b9bf16b-039c-46ba-ae41-f0622530202d-kube-api-access-lhq6h\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.738145 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzb6c\" (UniqueName: \"kubernetes.io/projected/0e45a139-0079-45cc-89a9-b1a0b0c1d179-kube-api-access-pzb6c\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.738164 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.738176 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxsth\" (UniqueName: \"kubernetes.io/projected/ce55d11a-887c-46e6-af05-90c3fca01e75-kube-api-access-sxsth\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.738186 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba159e27-7a3b-4b90-a7db-de6135f8153c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.753821 4763 generic.go:334] "Generic (PLEG): container finished" podID="e5f7940e-dedf-45a0-97b4-dc825dc00fc5" containerID="8d9e3ab86dc859f16e88025097f97a98ba29d69374fe2446837e45205f560afd" exitCode=0 Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.753907 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e5f7940e-dedf-45a0-97b4-dc825dc00fc5","Type":"ContainerDied","Data":"8d9e3ab86dc859f16e88025097f97a98ba29d69374fe2446837e45205f560afd"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.753930 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e5f7940e-dedf-45a0-97b4-dc825dc00fc5","Type":"ContainerDied","Data":"d5bdae3e0963d01b6c4265fdebe04dcdd44b7df16dbb249020ed0328adfb9388"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.753941 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5bdae3e0963d01b6c4265fdebe04dcdd44b7df16dbb249020ed0328adfb9388" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.761725 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b321cfd6-9039-4fe6-a39c-619f101d5e30-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b321cfd6-9039-4fe6-a39c-619f101d5e30" (UID: "b321cfd6-9039-4fe6-a39c-619f101d5e30"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.765052 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.766778 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b87bfdd4b-tbjxc" event={"ID":"15fbd312-35ac-4e62-ad60-ffccf94eab4a","Type":"ContainerDied","Data":"406b783e23b8c4c640b95b39ff2f63415a30004ec013d7d1ccc9d85eec9a71a8"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.766864 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b87bfdd4b-tbjxc" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.769957 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.774547 4763 generic.go:334] "Generic (PLEG): container finished" podID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerID="4050fb5fd9697e750ea813cde28a4d185f69fe05aa260d270466a73fd43cd815" exitCode=0 Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.774614 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dafb3edf-a4c0-4131-ad09-f836de63ff6b","Type":"ContainerDied","Data":"4050fb5fd9697e750ea813cde28a4d185f69fe05aa260d270466a73fd43cd815"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.780994 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b321cfd6-9039-4fe6-a39c-619f101d5e30-logs" (OuterVolumeSpecName: "logs") pod "b321cfd6-9039-4fe6-a39c-619f101d5e30" (UID: "b321cfd6-9039-4fe6-a39c-619f101d5e30"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.783410 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b321cfd6-9039-4fe6-a39c-619f101d5e30-kube-api-access-dj7kl" (OuterVolumeSpecName: "kube-api-access-dj7kl") pod "b321cfd6-9039-4fe6-a39c-619f101d5e30" (UID: "b321cfd6-9039-4fe6-a39c-619f101d5e30"). InnerVolumeSpecName "kube-api-access-dj7kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.783439 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b321cfd6-9039-4fe6-a39c-619f101d5e30" (UID: "b321cfd6-9039-4fe6-a39c-619f101d5e30"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.784689 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0" (UID: "a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.788721 4763 generic.go:334] "Generic (PLEG): container finished" podID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerID="59b75d8a10fde456e075a28d38cdb8ef12838b4b2acfbdbbde03b04350659d72" exitCode=0 Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.788828 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67bf5b69fb-ff2xw" event={"ID":"5ed0d19e-bbae-437d-9083-cded205c65f6","Type":"ContainerDied","Data":"59b75d8a10fde456e075a28d38cdb8ef12838b4b2acfbdbbde03b04350659d72"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.790686 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7550cde2-d6ca-4dc1-8772-5eea0a9b8142-kube-api-access-v4pwj" (OuterVolumeSpecName: "kube-api-access-v4pwj") pod "7550cde2-d6ca-4dc1-8772-5eea0a9b8142" (UID: "7550cde2-d6ca-4dc1-8772-5eea0a9b8142"). InnerVolumeSpecName "kube-api-access-v4pwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.799370 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.803024 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ce55d11a-887c-46e6-af05-90c3fca01e75" (UID: "ce55d11a-887c-46e6-af05-90c3fca01e75"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.805270 4763 scope.go:117] "RemoveContainer" containerID="3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.818880 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-scripts" (OuterVolumeSpecName: "scripts") pod "b321cfd6-9039-4fe6-a39c-619f101d5e30" (UID: "b321cfd6-9039-4fe6-a39c-619f101d5e30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.819114 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder97e0-account-delete-988h9" event={"ID":"ae274648-abe2-416e-a43d-edc836edc424","Type":"ContainerDied","Data":"622d618778d05bbbdd744c9c116177e9d0287d466cbd347edf7f53b1fe93641e"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.819216 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder97e0-account-delete-988h9" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.827227 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b321cfd6-9039-4fe6-a39c-619f101d5e30" (UID: "b321cfd6-9039-4fe6-a39c-619f101d5e30"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.839916 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-combined-ca-bundle\") pod \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.840197 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-memcached-tls-certs\") pod \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.840284 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft9rn\" (UniqueName: \"kubernetes.io/projected/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kube-api-access-ft9rn\") pod \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.841002 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-config-data\") pod \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.841121 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kolla-config\") pod \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\" (UID: \"e5f7940e-dedf-45a0-97b4-dc825dc00fc5\") " Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.841662 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e5f7940e-dedf-45a0-97b4-dc825dc00fc5" (UID: "e5f7940e-dedf-45a0-97b4-dc825dc00fc5"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.841766 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-config-data" (OuterVolumeSpecName: "config-data") pod "e5f7940e-dedf-45a0-97b4-dc825dc00fc5" (UID: "e5f7940e-dedf-45a0-97b4-dc825dc00fc5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.842640 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ba159e27-7a3b-4b90-a7db-de6135f8153c" (UID: "ba159e27-7a3b-4b90-a7db-de6135f8153c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843363 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj7kl\" (UniqueName: \"kubernetes.io/projected/b321cfd6-9039-4fe6-a39c-619f101d5e30-kube-api-access-dj7kl\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843389 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce55d11a-887c-46e6-af05-90c3fca01e75-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843399 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843409 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843418 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b321cfd6-9039-4fe6-a39c-619f101d5e30-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843427 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4pwj\" (UniqueName: \"kubernetes.io/projected/7550cde2-d6ca-4dc1-8772-5eea0a9b8142-kube-api-access-v4pwj\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843439 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843447 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843455 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843464 4763 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843472 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b321cfd6-9039-4fe6-a39c-619f101d5e30-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843480 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.843488 4763 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kolla-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4763]: 
I0930 13:58:15.850910 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kube-api-access-ft9rn" (OuterVolumeSpecName: "kube-api-access-ft9rn") pod "e5f7940e-dedf-45a0-97b4-dc825dc00fc5" (UID: "e5f7940e-dedf-45a0-97b4-dc825dc00fc5"). InnerVolumeSpecName "kube-api-access-ft9rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.851809 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba159e27-7a3b-4b90-a7db-de6135f8153c","Type":"ContainerDied","Data":"5e9262290f06c0b049b8c4a763dd2786af46c0daf94e2c034bd9a8071f39cfc3"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.851908 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.853012 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5b87bfdd4b-tbjxc"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.862942 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5b87bfdd4b-tbjxc"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.863935 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican41ee-account-delete-q7t27" event={"ID":"4a588d68-fc19-4242-9b61-0ed79678fc9e","Type":"ContainerDied","Data":"5f8b5de9d649736ce71949b48e43c1af95542cb542439785a04e6c37d0a3441f"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.863998 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican41ee-account-delete-q7t27" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.869241 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder97e0-account-delete-988h9"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.869689 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-config-data" (OuterVolumeSpecName: "config-data") pod "ba159e27-7a3b-4b90-a7db-de6135f8153c" (UID: "ba159e27-7a3b-4b90-a7db-de6135f8153c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.870173 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementb89d-account-delete-df5j7" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.870213 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementb89d-account-delete-df5j7" event={"ID":"7550cde2-d6ca-4dc1-8772-5eea0a9b8142","Type":"ContainerDied","Data":"f3e40e4f3fde93bd5b0aacedcf9067d957fd5c3173624f059fa6c2e124e785aa"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.870253 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e40e4f3fde93bd5b0aacedcf9067d957fd5c3173624f059fa6c2e124e785aa" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.880880 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data" (OuterVolumeSpecName: "config-data") pod "b321cfd6-9039-4fe6-a39c-619f101d5e30" (UID: "b321cfd6-9039-4fe6-a39c-619f101d5e30"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.881310 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibbaf-account-delete-rr8rm" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.881372 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibbaf-account-delete-rr8rm" event={"ID":"0e45a139-0079-45cc-89a9-b1a0b0c1d179","Type":"ContainerDied","Data":"253b2e156d518de184dbf86f2fdf0886c5ebe8d405e87877ee3f11ae8e20b2c8"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.882040 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.883914 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder97e0-account-delete-988h9"] Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.886643 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.887005 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.887199 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bb91013-85e0-4a13-9a06-0608b16a147b","Type":"ContainerDied","Data":"26c6f327823006b666d992af446a7422c031f313b3bfcbafd3c3d69d64cfc3cd"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.890880 4763 generic.go:334] "Generic (PLEG): container finished" podID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerID="15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7" exitCode=0 Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.890942 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6002d74d-668d-4f30-b13a-c87ec6a8a3b8","Type":"ContainerDied","Data":"15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.890984 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.891891 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b321cfd6-9039-4fe6-a39c-619f101d5e30" (UID: "b321cfd6-9039-4fe6-a39c-619f101d5e30"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.892775 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5f7940e-dedf-45a0-97b4-dc825dc00fc5" (UID: "e5f7940e-dedf-45a0-97b4-dc825dc00fc5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.902099 4763 generic.go:334] "Generic (PLEG): container finished" podID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerID="9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5" exitCode=0 Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.902185 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"709e8d49-783d-44fb-8bcb-0b4ac2199efe","Type":"ContainerDied","Data":"9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5"} Sep 30 13:58:15 crc kubenswrapper[4763]: I0930 13:58:15.902274 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.912531 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0903f-account-delete-2hlbl" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.912532 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0903f-account-delete-2hlbl" event={"ID":"7b9bf16b-039c-46ba-ae41-f0622530202d","Type":"ContainerDied","Data":"0f14aade8b4af8f3bd745e51714b8a0fae5fba3e4e8568965f4260f4ccc70e29"} Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.917088 4763 generic.go:334] "Generic (PLEG): container finished" podID="b321cfd6-9039-4fe6-a39c-619f101d5e30" containerID="ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc" exitCode=0 Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.917145 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b321cfd6-9039-4fe6-a39c-619f101d5e30","Type":"ContainerDied","Data":"ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc"} Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.917183 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b321cfd6-9039-4fe6-a39c-619f101d5e30","Type":"ContainerDied","Data":"7ed00235354744d2bb87a6a58240e6d5dd0fcab0d185bf8a571d80442c36ebbf"} Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.917191 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.917208 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-95cdd9cf8-gbh25" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.917268 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5f884f68c5-j4x5x" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.933577 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "e5f7940e-dedf-45a0-97b4-dc825dc00fc5" (UID: "e5f7940e-dedf-45a0-97b4-dc825dc00fc5"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944130 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx728\" (UniqueName: \"kubernetes.io/projected/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-kube-api-access-lx728\") pod \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944217 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-internal-tls-certs\") pod \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944262 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-config-data\") pod \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944278 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-combined-ca-bundle\") pod \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944293 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-config-data\") pod \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944323 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-nova-metadata-tls-certs\") pod \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944380 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-combined-ca-bundle\") pod \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944405 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/709e8d49-783d-44fb-8bcb-0b4ac2199efe-logs\") pod \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944445 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-public-tls-certs\") pod \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944481 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2dnr\" (UniqueName: \"kubernetes.io/projected/709e8d49-783d-44fb-8bcb-0b4ac2199efe-kube-api-access-c2dnr\") pod 
\"709e8d49-783d-44fb-8bcb-0b4ac2199efe\" (UID: \"709e8d49-783d-44fb-8bcb-0b4ac2199efe\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944516 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-logs\") pod \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\" (UID: \"6002d74d-668d-4f30-b13a-c87ec6a8a3b8\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944949 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944967 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944979 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.944990 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.945002 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba159e27-7a3b-4b90-a7db-de6135f8153c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.945015 4763 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.945026 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft9rn\" (UniqueName: \"kubernetes.io/projected/e5f7940e-dedf-45a0-97b4-dc825dc00fc5-kube-api-access-ft9rn\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.945031 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/709e8d49-783d-44fb-8bcb-0b4ac2199efe-logs" (OuterVolumeSpecName: "logs") pod "709e8d49-783d-44fb-8bcb-0b4ac2199efe" (UID: "709e8d49-783d-44fb-8bcb-0b4ac2199efe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.946378 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-logs" (OuterVolumeSpecName: "logs") pod "6002d74d-668d-4f30-b13a-c87ec6a8a3b8" (UID: "6002d74d-668d-4f30-b13a-c87ec6a8a3b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.948078 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b321cfd6-9039-4fe6-a39c-619f101d5e30" (UID: "b321cfd6-9039-4fe6-a39c-619f101d5e30"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.948136 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-kube-api-access-lx728" (OuterVolumeSpecName: "kube-api-access-lx728") pod "6002d74d-668d-4f30-b13a-c87ec6a8a3b8" (UID: "6002d74d-668d-4f30-b13a-c87ec6a8a3b8"). InnerVolumeSpecName "kube-api-access-lx728". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.951425 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709e8d49-783d-44fb-8bcb-0b4ac2199efe-kube-api-access-c2dnr" (OuterVolumeSpecName: "kube-api-access-c2dnr") pod "709e8d49-783d-44fb-8bcb-0b4ac2199efe" (UID: "709e8d49-783d-44fb-8bcb-0b4ac2199efe"). InnerVolumeSpecName "kube-api-access-c2dnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.964795 4763 scope.go:117] "RemoveContainer" containerID="26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a" Sep 30 13:58:16 crc kubenswrapper[4763]: E0930 13:58:15.965302 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a\": container with ID starting with 26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a not found: ID does not exist" containerID="26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.965347 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a"} err="failed to get container status \"26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a\": rpc error: code = NotFound desc = could not find container \"26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a\": container with ID starting with 26b7f806154c4c28cb2af7f5a74d915ee35c879804f0f217341cb8e0d581684a not found: ID does not exist" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.965381 4763 scope.go:117] "RemoveContainer" containerID="3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d" Sep 30 13:58:16 crc kubenswrapper[4763]: E0930 13:58:15.965844 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d\": container with ID starting with 3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d not found: ID does not exist" containerID="3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.965878 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d"} err="failed to get container status \"3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d\": rpc error: code = NotFound desc = could not find container \"3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d\": container with ID starting with 3149c070e258ceeb16106de761a1c9ec5c3916568baf33a404009f400ca8176d not found: ID does not exist" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.965906 4763 scope.go:117] "RemoveContainer" 
containerID="a9c41d23775fc1eb5013bbc346ef1bd994014c9076f54a0e50116c9cc0474cc7" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.972712 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "709e8d49-783d-44fb-8bcb-0b4ac2199efe" (UID: "709e8d49-783d-44fb-8bcb-0b4ac2199efe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.973740 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6002d74d-668d-4f30-b13a-c87ec6a8a3b8" (UID: "6002d74d-668d-4f30-b13a-c87ec6a8a3b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.981806 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-config-data" (OuterVolumeSpecName: "config-data") pod "6002d74d-668d-4f30-b13a-c87ec6a8a3b8" (UID: "6002d74d-668d-4f30-b13a-c87ec6a8a3b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:15.987848 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-config-data" (OuterVolumeSpecName: "config-data") pod "709e8d49-783d-44fb-8bcb-0b4ac2199efe" (UID: "709e8d49-783d-44fb-8bcb-0b4ac2199efe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.017581 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6002d74d-668d-4f30-b13a-c87ec6a8a3b8" (UID: "6002d74d-668d-4f30-b13a-c87ec6a8a3b8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.023885 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6002d74d-668d-4f30-b13a-c87ec6a8a3b8" (UID: "6002d74d-668d-4f30-b13a-c87ec6a8a3b8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.025498 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementb89d-account-delete-df5j7"] Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.031820 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-67bf5b69fb-ff2xw" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.046249 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.046281 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.046294 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.046304 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.046315 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/709e8d49-783d-44fb-8bcb-0b4ac2199efe-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.046325 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.046335 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2dnr\" (UniqueName: \"kubernetes.io/projected/709e8d49-783d-44fb-8bcb-0b4ac2199efe-kube-api-access-c2dnr\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.046347 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.046357 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx728\" (UniqueName: \"kubernetes.io/projected/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-kube-api-access-lx728\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.046369 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b321cfd6-9039-4fe6-a39c-619f101d5e30-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.046379 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002d74d-668d-4f30-b13a-c87ec6a8a3b8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.053302 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementb89d-account-delete-df5j7"] Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.068865 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican41ee-account-delete-q7t27"] Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.077642 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican41ee-account-delete-q7t27"] Sep 30 13:58:16 crc kubenswrapper[4763]: 
I0930 13:58:16.086436 4763 scope.go:117] "RemoveContainer" containerID="6e00eb474337eb85a3ae6ce678a0a8afddc2bad42ef7bdbf41de0b427ce3b086" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.102508 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "709e8d49-783d-44fb-8bcb-0b4ac2199efe" (UID: "709e8d49-783d-44fb-8bcb-0b4ac2199efe"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.147255 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed0d19e-bbae-437d-9083-cded205c65f6-logs\") pod \"5ed0d19e-bbae-437d-9083-cded205c65f6\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.147330 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data-custom\") pod \"5ed0d19e-bbae-437d-9083-cded205c65f6\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.147355 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data\") pod \"5ed0d19e-bbae-437d-9083-cded205c65f6\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.147481 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-internal-tls-certs\") pod \"5ed0d19e-bbae-437d-9083-cded205c65f6\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.147529 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5llr\" (UniqueName: \"kubernetes.io/projected/5ed0d19e-bbae-437d-9083-cded205c65f6-kube-api-access-b5llr\") pod \"5ed0d19e-bbae-437d-9083-cded205c65f6\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.147549 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-combined-ca-bundle\") pod \"5ed0d19e-bbae-437d-9083-cded205c65f6\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.147583 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-public-tls-certs\") pod \"5ed0d19e-bbae-437d-9083-cded205c65f6\" (UID: \"5ed0d19e-bbae-437d-9083-cded205c65f6\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.147942 4763 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/709e8d49-783d-44fb-8bcb-0b4ac2199efe-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.154525 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapibbaf-account-delete-rr8rm"] Sep 30 13:58:16 crc 
kubenswrapper[4763]: I0930 13:58:16.156933 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed0d19e-bbae-437d-9083-cded205c65f6-logs" (OuterVolumeSpecName: "logs") pod "5ed0d19e-bbae-437d-9083-cded205c65f6" (UID: "5ed0d19e-bbae-437d-9083-cded205c65f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.160382 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5ed0d19e-bbae-437d-9083-cded205c65f6" (UID: "5ed0d19e-bbae-437d-9083-cded205c65f6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.161817 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapibbaf-account-delete-rr8rm"] Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.180271 4763 scope.go:117] "RemoveContainer" containerID="0c4968490ce8a08e1aec5c2072537900212ab6566f70cf01898816dd71f1b15c" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.180338 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed0d19e-bbae-437d-9083-cded205c65f6-kube-api-access-b5llr" (OuterVolumeSpecName: "kube-api-access-b5llr") pod "5ed0d19e-bbae-437d-9083-cded205c65f6" (UID: "5ed0d19e-bbae-437d-9083-cded205c65f6"). InnerVolumeSpecName "kube-api-access-b5llr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.182589 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ed0d19e-bbae-437d-9083-cded205c65f6" (UID: "5ed0d19e-bbae-437d-9083-cded205c65f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.204180 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data" (OuterVolumeSpecName: "config-data") pod "5ed0d19e-bbae-437d-9083-cded205c65f6" (UID: "5ed0d19e-bbae-437d-9083-cded205c65f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.229502 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5ed0d19e-bbae-437d-9083-cded205c65f6" (UID: "5ed0d19e-bbae-437d-9083-cded205c65f6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.250057 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.250093 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed0d19e-bbae-437d-9083-cded205c65f6-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.250105 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.250119 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.250132 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.250144 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5llr\" (UniqueName: \"kubernetes.io/projected/5ed0d19e-bbae-437d-9083-cded205c65f6-kube-api-access-b5llr\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.270002 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5ed0d19e-bbae-437d-9083-cded205c65f6" (UID: "5ed0d19e-bbae-437d-9083-cded205c65f6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.329047 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3119638a-6580-4a24-8e7f-40f7f7d788a5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.351637 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed0d19e-bbae-437d-9083-cded205c65f6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.502444 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e45a139-0079-45cc-89a9-b1a0b0c1d179" path="/var/lib/kubelet/pods/0e45a139-0079-45cc-89a9-b1a0b0c1d179/volumes" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.504073 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" path="/var/lib/kubelet/pods/15fbd312-35ac-4e62-ad60-ffccf94eab4a/volumes" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.505760 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db73295-0655-443c-91e0-2cd08b119141" path="/var/lib/kubelet/pods/1db73295-0655-443c-91e0-2cd08b119141/volumes" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.506798 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a588d68-fc19-4242-9b61-0ed79678fc9e" path="/var/lib/kubelet/pods/4a588d68-fc19-4242-9b61-0ed79678fc9e/volumes" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.507455 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7550cde2-d6ca-4dc1-8772-5eea0a9b8142" path="/var/lib/kubelet/pods/7550cde2-d6ca-4dc1-8772-5eea0a9b8142/volumes" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.508687 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783d0307-40e6-4d1e-9728-b1fe356e6b52" path="/var/lib/kubelet/pods/783d0307-40e6-4d1e-9728-b1fe356e6b52/volumes" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.509666 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a99f7915-f0b7-498a-941d-b02d87df4b98" path="/var/lib/kubelet/pods/a99f7915-f0b7-498a-941d-b02d87df4b98/volumes" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.510473 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae274648-abe2-416e-a43d-edc836edc424" path="/var/lib/kubelet/pods/ae274648-abe2-416e-a43d-edc836edc424/volumes" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.512145 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c5264e-b119-4444-b954-c33b428294b5" path="/var/lib/kubelet/pods/e2c5264e-b119-4444-b954-c33b428294b5/volumes" Sep 30 13:58:16 crc kubenswrapper[4763]: E0930 13:58:16.560665 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Sep 30 13:58:16 crc kubenswrapper[4763]: E0930 13:58:16.561257 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data podName:aebd5213-18eb-4d84-b39e-fd22f9ff9a6c nodeName:}" failed. No retries permitted until 2025-09-30 13:58:24.561209473 +0000 UTC m=+1376.699769758 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data") pod "rabbitmq-cell1-server-0" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c") : configmap "rabbitmq-cell1-config-data" not found Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.734056 4763 scope.go:117] "RemoveContainer" containerID="001400b738e180e3a088717caa1a086bcf8cc3e72a547b2cbd98bcaa871db31d" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.734291 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0903f-account-delete-2hlbl"] Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.743830 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0903f-account-delete-2hlbl"] Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.752397 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-95cdd9cf8-gbh25"] Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.758110 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-95cdd9cf8-gbh25"] Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.770503 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.784758 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_916727b2-6488-4edf-b33b-c5908eae0e41/ovn-northd/0.log" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.784832 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.785944 4763 scope.go:117] "RemoveContainer" containerID="cb40ecb42c9c7e13873d46e9437c88735cec11180b4edf02393f88c403b8189b" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.825998 4763 scope.go:117] "RemoveContainer" containerID="1b8f0928a35a4ae56d6ef0cb85281920dd3c4a313f493d020961d40d139fa47b" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.857952 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.865924 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-tls\") pod \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.865974 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-confd\") pod \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866008 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-northd-tls-certs\") pod \"916727b2-6488-4edf-b33b-c5908eae0e41\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866047 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\" 
(UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866068 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-plugins-conf\") pod \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866100 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data\") pod \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866118 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-plugins\") pod \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866153 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-scripts\") pod \"916727b2-6488-4edf-b33b-c5908eae0e41\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866177 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7qll\" (UniqueName: \"kubernetes.io/projected/916727b2-6488-4edf-b33b-c5908eae0e41-kube-api-access-s7qll\") pod \"916727b2-6488-4edf-b33b-c5908eae0e41\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866196 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-config\") pod \"916727b2-6488-4edf-b33b-c5908eae0e41\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866229 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-pod-info\") pod \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866248 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzs9q\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-kube-api-access-pzs9q\") pod \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866278 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-erlang-cookie-secret\") pod \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866297 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-metrics-certs-tls-certs\") pod \"916727b2-6488-4edf-b33b-c5908eae0e41\" (UID: 
\"916727b2-6488-4edf-b33b-c5908eae0e41\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866317 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-rundir\") pod \"916727b2-6488-4edf-b33b-c5908eae0e41\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866333 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-erlang-cookie\") pod \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866371 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-combined-ca-bundle\") pod \"916727b2-6488-4edf-b33b-c5908eae0e41\" (UID: \"916727b2-6488-4edf-b33b-c5908eae0e41\") " Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.866395 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-server-conf\") pod \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\" (UID: \"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c\") " Sep 30 13:58:16 crc kubenswrapper[4763]: E0930 13:58:16.866755 4763 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Sep 30 13:58:16 crc kubenswrapper[4763]: E0930 13:58:16.866801 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data podName:3119638a-6580-4a24-8e7f-40f7f7d788a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:24.866786783 +0000 UTC m=+1377.005347068 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data") pod "rabbitmq-server-0" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5") : configmap "rabbitmq-config-data" not found Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.869468 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-config" (OuterVolumeSpecName: "config") pod "916727b2-6488-4edf-b33b-c5908eae0e41" (UID: "916727b2-6488-4edf-b33b-c5908eae0e41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.874984 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "916727b2-6488-4edf-b33b-c5908eae0e41" (UID: "916727b2-6488-4edf-b33b-c5908eae0e41"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.876115 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-scripts" (OuterVolumeSpecName: "scripts") pod "916727b2-6488-4edf-b33b-c5908eae0e41" (UID: "916727b2-6488-4edf-b33b-c5908eae0e41"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.876522 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.882411 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.887118 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.887415 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-pod-info" (OuterVolumeSpecName: "pod-info") pod "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.887985 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.894219 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-kube-api-access-pzs9q" (OuterVolumeSpecName: "kube-api-access-pzs9q") pod "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c"). InnerVolumeSpecName "kube-api-access-pzs9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.894501 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.894711 4763 scope.go:117] "RemoveContainer" containerID="9355d58d746614fcc1470f5ee2a3cbecebd32880006f604b38e2e1f09b1a4a21"
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.901392 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.905951 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.906127 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.909208 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.913776 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/916727b2-6488-4edf-b33b-c5908eae0e41-kube-api-access-s7qll" (OuterVolumeSpecName: "kube-api-access-s7qll") pod "916727b2-6488-4edf-b33b-c5908eae0e41" (UID: "916727b2-6488-4edf-b33b-c5908eae0e41"). InnerVolumeSpecName "kube-api-access-s7qll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.914685 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.920158 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.925759 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.932481 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.933313 4763 scope.go:117] "RemoveContainer" containerID="bd91d5f1697ca42a565bdfcc3ac0d792de9c64aaa05b55fe8b3688f3e91d36cd"
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.938715 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.965664 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.970473 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data" (OuterVolumeSpecName: "config-data") pod "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971863 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971895 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-plugins-conf\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971906 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971915 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971924 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971933 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7qll\" (UniqueName: \"kubernetes.io/projected/916727b2-6488-4edf-b33b-c5908eae0e41-kube-api-access-s7qll\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971941 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916727b2-6488-4edf-b33b-c5908eae0e41-config\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971955 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-pod-info\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971964 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzs9q\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-kube-api-access-pzs9q\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971972 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971979 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-rundir\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.971991 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.972001 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.980619 4763 scope.go:117] "RemoveContainer" containerID="e1f5778fa17d7cddfefc9145f7fd206fb41d3fa0e3cf06f8bd8e12eb8a451d1b"
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.983149 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.984778 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "916727b2-6488-4edf-b33b-c5908eae0e41" (UID: "916727b2-6488-4edf-b33b-c5908eae0e41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.990009 4763 generic.go:334] "Generic (PLEG): container finished" podID="aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" containerID="28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9" exitCode=0
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.990065 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c","Type":"ContainerDied","Data":"28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9"}
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.990090 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aebd5213-18eb-4d84-b39e-fd22f9ff9a6c","Type":"ContainerDied","Data":"39be8740061f3f51aef9766adac9cb7379938198ef064858f5d80903a2095993"}
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.990152 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.990823 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.997018 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67bf5b69fb-ff2xw"
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.997009 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67bf5b69fb-ff2xw" event={"ID":"5ed0d19e-bbae-437d-9083-cded205c65f6","Type":"ContainerDied","Data":"b1babd7d8f632f71496cd71e56038f714301a0fbc190f6fa9720ac20a4e827ce"}
Sep 30 13:58:16 crc kubenswrapper[4763]: I0930 13:58:16.997415 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.002710 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.007484 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.012411 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "916727b2-6488-4edf-b33b-c5908eae0e41" (UID: "916727b2-6488-4edf-b33b-c5908eae0e41"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.013188 4763 scope.go:117] "RemoveContainer" containerID="cba654c3201589fdfeff007899f0f73ab1e63e34fc4ccb4f54ba1464e5755d0a"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.018105 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_916727b2-6488-4edf-b33b-c5908eae0e41/ovn-northd/0.log"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.018165 4763 generic.go:334] "Generic (PLEG): container finished" podID="916727b2-6488-4edf-b33b-c5908eae0e41" containerID="e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737" exitCode=139
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.018266 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"916727b2-6488-4edf-b33b-c5908eae0e41","Type":"ContainerDied","Data":"e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737"}
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.018314 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"916727b2-6488-4edf-b33b-c5908eae0e41","Type":"ContainerDied","Data":"ff96488141df2f82b229455031c651a019c33c792c9a917bfaa2b10e83bc49b3"}
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.018874 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.026966 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.029560 4763 generic.go:334] "Generic (PLEG): container finished" podID="3119638a-6580-4a24-8e7f-40f7f7d788a5" containerID="c5af37dfd26586dfbe5d5f60114f298ea522d4e3bbbc87c8e965efa23a5cf953" exitCode=0
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.029623 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3119638a-6580-4a24-8e7f-40f7f7d788a5","Type":"ContainerDied","Data":"c5af37dfd26586dfbe5d5f60114f298ea522d4e3bbbc87c8e965efa23a5cf953"}
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.031139 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5f884f68c5-j4x5x"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.035206 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.053075 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "916727b2-6488-4edf-b33b-c5908eae0e41" (UID: "916727b2-6488-4edf-b33b-c5908eae0e41"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.053362 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5f884f68c5-j4x5x"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.061469 4763 scope.go:117] "RemoveContainer" containerID="15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.064331 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.073531 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.073563 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.073576 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.073586 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/916727b2-6488-4edf-b33b-c5908eae0e41-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.073595 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.082221 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-67bf5b69fb-ff2xw"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.084902 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-server-conf" (OuterVolumeSpecName: "server-conf") pod "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" (UID: "aebd5213-18eb-4d84-b39e-fd22f9ff9a6c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.098921 4763 scope.go:117] "RemoveContainer" containerID="963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.107064 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-67bf5b69fb-ff2xw"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.114963 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.119990 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.125335 4763 scope.go:117] "RemoveContainer" containerID="9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.141433 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="585dbe18ef01f1b67ce719782f36c12ee759638e77c5c8ef61ae81a0620d03f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.143285 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="585dbe18ef01f1b67ce719782f36c12ee759638e77c5c8ef61ae81a0620d03f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.151830 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="585dbe18ef01f1b67ce719782f36c12ee759638e77c5c8ef61ae81a0620d03f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.151937 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="aefbc43e-494e-48a6-963c-7be9d0159387" containerName="nova-cell1-conductor-conductor"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.161375 4763 scope.go:117] "RemoveContainer" containerID="6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.174999 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3119638a-6580-4a24-8e7f-40f7f7d788a5-pod-info\") pod \"3119638a-6580-4a24-8e7f-40f7f7d788a5\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.175063 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data\") pod \"3119638a-6580-4a24-8e7f-40f7f7d788a5\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.175088 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3119638a-6580-4a24-8e7f-40f7f7d788a5-erlang-cookie-secret\") pod \"3119638a-6580-4a24-8e7f-40f7f7d788a5\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.175161 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-tls\") pod \"3119638a-6580-4a24-8e7f-40f7f7d788a5\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.175220 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-erlang-cookie\") pod \"3119638a-6580-4a24-8e7f-40f7f7d788a5\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.175256 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-server-conf\") pod \"3119638a-6580-4a24-8e7f-40f7f7d788a5\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.175346 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-plugins-conf\") pod \"3119638a-6580-4a24-8e7f-40f7f7d788a5\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.175436 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5kwk\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-kube-api-access-f5kwk\") pod \"3119638a-6580-4a24-8e7f-40f7f7d788a5\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.175459 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-plugins\") pod \"3119638a-6580-4a24-8e7f-40f7f7d788a5\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.175495 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-confd\") pod \"3119638a-6580-4a24-8e7f-40f7f7d788a5\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.175515 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"3119638a-6580-4a24-8e7f-40f7f7d788a5\" (UID: \"3119638a-6580-4a24-8e7f-40f7f7d788a5\") "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.175828 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c-server-conf\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.176707 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3119638a-6580-4a24-8e7f-40f7f7d788a5" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.179745 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "3119638a-6580-4a24-8e7f-40f7f7d788a5" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.180151 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3119638a-6580-4a24-8e7f-40f7f7d788a5" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.181115 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-kube-api-access-f5kwk" (OuterVolumeSpecName: "kube-api-access-f5kwk") pod "3119638a-6580-4a24-8e7f-40f7f7d788a5" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5"). InnerVolumeSpecName "kube-api-access-f5kwk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.181648 4763 scope.go:117] "RemoveContainer" containerID="f04f15127c26fdd23725e119940adc91f2443e20537485e0fcc64c95fdba4519"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.181762 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3119638a-6580-4a24-8e7f-40f7f7d788a5" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.181837 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3119638a-6580-4a24-8e7f-40f7f7d788a5-pod-info" (OuterVolumeSpecName: "pod-info") pod "3119638a-6580-4a24-8e7f-40f7f7d788a5" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.184035 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3119638a-6580-4a24-8e7f-40f7f7d788a5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3119638a-6580-4a24-8e7f-40f7f7d788a5" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.184538 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3119638a-6580-4a24-8e7f-40f7f7d788a5" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.198954 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data" (OuterVolumeSpecName: "config-data") pod "3119638a-6580-4a24-8e7f-40f7f7d788a5" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.202577 4763 scope.go:117] "RemoveContainer" containerID="ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.224328 4763 scope.go:117] "RemoveContainer" containerID="b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.227611 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-server-conf" (OuterVolumeSpecName: "server-conf") pod "3119638a-6580-4a24-8e7f-40f7f7d788a5" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.247127 4763 scope.go:117] "RemoveContainer" containerID="ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.247758 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc\": container with ID starting with ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc not found: ID does not exist" containerID="ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.247790 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc"} err="failed to get container status \"ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc\": rpc error: code = NotFound desc = could not find container \"ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc\": container with ID starting with ee0a430edb42a7272bf01f75b24f3cf801eff8e40c5e3e55524936c34ff763bc not found: ID does not exist"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.247811 4763 scope.go:117] "RemoveContainer" containerID="b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.248188 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11\": container with ID starting with b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11 not found: ID does not exist" containerID="b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.248234 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11"} err="failed to get container status \"b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11\": rpc error: code = NotFound desc = could not find container \"b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11\": container with ID starting with b2df90f2107573ddc9d8f21b00c91756d25c6d1d3fb13f0ff87a67bff27b7f11 not found: ID does not exist"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.248262 4763 scope.go:117] "RemoveContainer" containerID="9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.249395 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5\": container with ID starting with 9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5 not found: ID does not exist" containerID="9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.249424 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5"} err="failed to get container status \"9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5\": rpc error: code = NotFound desc = could not find container \"9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5\": container with ID starting with 9b0f7dc91fc0c9c506cf0c051205ef84a046e4fc8a698b70bf872a88d4da4aa5 not found: ID does not exist"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.249444 4763 scope.go:117] "RemoveContainer" containerID="6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.249738 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152\": container with ID starting with 6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152 not found: ID does not exist" containerID="6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.249766 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152"} err="failed to get container status \"6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152\": rpc error: code = NotFound desc = could not find container \"6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152\": container with ID starting with 6b177eb74be0b616bfe826746bd95b3433b2738858970194859a3a942956e152 not found: ID does not exist"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.249786 4763 scope.go:117] "RemoveContainer" containerID="28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.257785 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3119638a-6580-4a24-8e7f-40f7f7d788a5" (UID: "3119638a-6580-4a24-8e7f-40f7f7d788a5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.282722 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5kwk\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-kube-api-access-f5kwk\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.282755 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.282764 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.282789 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.282799 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3119638a-6580-4a24-8e7f-40f7f7d788a5-pod-info\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.282807 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.282818 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3119638a-6580-4a24-8e7f-40f7f7d788a5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.282826 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.282834 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3119638a-6580-4a24-8e7f-40f7f7d788a5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.282844 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-server-conf\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.282853 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3119638a-6580-4a24-8e7f-40f7f7d788a5-plugins-conf\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.304471 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.311955 4763 scope.go:117] "RemoveContainer" containerID="ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.374116 4763 scope.go:117] "RemoveContainer" containerID="28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.375651 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9\": container with ID starting with 28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9 not found: ID does not exist" containerID="28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.375751 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9"} err="failed to get container status \"28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9\": rpc error: code = NotFound desc = could not find container \"28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9\": container with ID starting with 28c6fa485db3f8e4445a35ac82ce78b5f23afb415b319059e8e8cb4bdc656ed9 not found: ID does not exist"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.375786 4763 scope.go:117] "RemoveContainer" containerID="ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.376016 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707\": container with ID starting with ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707 not found: ID does not exist" containerID="ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.376043 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707"} err="failed to get container status \"ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707\": rpc error: code = NotFound desc = could not find container \"ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707\": container with ID starting with ff37408ee60ad29cf03210fef1245b97ec841c65e99971c346a9fb9b590c9707 not found: ID does not exist"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.376061 4763 scope.go:117] "RemoveContainer" containerID="59b75d8a10fde456e075a28d38cdb8ef12838b4b2acfbdbbde03b04350659d72"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.384038 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.423886 4763 scope.go:117] "RemoveContainer" containerID="e8da034aa8e3585dad3aebd273d533766fe99e7f47d29b8c0da60ce7e190c340"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.432122 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.443108 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.449079 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.453941 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.454847 4763 scope.go:117] "RemoveContainer" containerID="777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.476875 4763 scope.go:117] "RemoveContainer" containerID="e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.516302 4763 scope.go:117] "RemoveContainer" containerID="777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.517137 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b\": container with ID starting with 777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b not found: ID does not exist" containerID="777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.517180 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b"} err="failed to get container status \"777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b\": rpc error: code = NotFound desc = could not find container \"777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b\": container with ID starting with 777e1c3b46b790bf755a377299cb18a32b80c28fe9215759faf89de4f57bc66b not found: ID does not exist"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.517207 4763 scope.go:117] "RemoveContainer" containerID="e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.517531 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737\": container with ID starting with e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737 not found: ID does not exist" containerID="e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.517579 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737"} err="failed to get container status \"e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737\": rpc error: code = NotFound desc = could not find container \"e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737\": container with ID starting with e6ac06ee54359c7b60b5a3776cfd54b2e61f67a6e2d704a37edc71460dcbf737 not found: ID does not exist"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.517639 4763 scope.go:117] "RemoveContainer" containerID="15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.518003 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7\": container with ID starting with 15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7 not found: ID does not exist" containerID="15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.518052 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7"} err="failed to get container status \"15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7\": rpc error: code = NotFound desc = could not find container \"15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7\": container with ID starting with 15b7f1be1ac2be0a27f49ce909b5c7b8cf5df243e1ed582f16483ec942407ce7 not found: ID does not exist"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.518084 4763 scope.go:117] "RemoveContainer" containerID="963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.519334 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1\": container with ID starting with 963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1 not found: ID does not exist" containerID="963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1"
Sep 30 13:58:17 crc kubenswrapper[4763]: I0930 13:58:17.519881 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1"} err="failed to get container status \"963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1\": rpc error: code = NotFound desc = could not find container \"963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1\": container with ID starting with 963b2cb071fdfa8b7fee0348661b6d1ebf1257bb29c8bdc337063d361d750dd1 not found: ID does not exist"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.530469 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.531995 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.533329 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.533367 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7373f404-a756-4321-bd57-e8d60585abff" containerName="nova-cell0-conductor-conductor"
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.801752 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.802852 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.804192 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 13:58:17 crc kubenswrapper[4763]: E0930 13:58:17.804294 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c4737b72-6133-4316-8b4e-1a7a3938cd05" containerName="nova-scheduler-scheduler"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.055500 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5775d899cd-b25ch"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.056028 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3119638a-6580-4a24-8e7f-40f7f7d788a5","Type":"ContainerDied","Data":"0f74623c31148386b7f2ed927d954b6a1888fb4461a3a1afe66adce24f49d293"}
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.056063 4763 scope.go:117] "RemoveContainer" containerID="c5af37dfd26586dfbe5d5f60114f298ea522d4e3bbbc87c8e965efa23a5cf953"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.056140 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.068969 4763 generic.go:334] "Generic (PLEG): container finished" podID="0cf247fc-bc61-4305-b8a5-19ac60eba62a" containerID="ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4" exitCode=0
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.069034 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5775d899cd-b25ch" event={"ID":"0cf247fc-bc61-4305-b8a5-19ac60eba62a","Type":"ContainerDied","Data":"ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4"}
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.069068 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5775d899cd-b25ch" event={"ID":"0cf247fc-bc61-4305-b8a5-19ac60eba62a","Type":"ContainerDied","Data":"ad395800e989605b6f3a8e35f1d2619b2d7eddf6583445bd471a7c51aab1d6ae"}
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.069069 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5775d899cd-b25ch"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.085137 4763 scope.go:117] "RemoveContainer" containerID="f4cd7078c0ddc04e2c8e6651fa5ad9a35e37bd449097a1bdf9256ab5a071cf30"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.109729 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.110048 4763 scope.go:117] "RemoveContainer" containerID="ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.118000 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.137814 4763 scope.go:117] "RemoveContainer" containerID="ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4"
Sep 30 13:58:18 crc kubenswrapper[4763]: E0930 13:58:18.138309 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4\": container with ID starting with ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4 not found: ID does not exist" containerID="ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.138339 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4"} err="failed to get container status \"ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4\": rpc error: code = NotFound desc = could not find container \"ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4\": container with ID starting with ed765b6243a5a7ff543e367d3e00ce3d62a077862650456f5448785aba2df0d4 not found: ID does not exist"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.201411 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-internal-tls-certs\") pod \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") "
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.201497 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-public-tls-certs\") pod \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") "
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.201523 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-combined-ca-bundle\") pod \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") "
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.201574 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-fernet-keys\") pod \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") "
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.201642 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-credential-keys\") pod \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") "
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.201667 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-scripts\") pod \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") "
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.201702 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9jk9\" (UniqueName: \"kubernetes.io/projected/0cf247fc-bc61-4305-b8a5-19ac60eba62a-kube-api-access-j9jk9\") pod \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") "
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.201825 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-config-data\") pod \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\" (UID: \"0cf247fc-bc61-4305-b8a5-19ac60eba62a\") "
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.206076 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0cf247fc-bc61-4305-b8a5-19ac60eba62a" (UID: "0cf247fc-bc61-4305-b8a5-19ac60eba62a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.206773 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-scripts" (OuterVolumeSpecName: "scripts") pod "0cf247fc-bc61-4305-b8a5-19ac60eba62a" (UID: "0cf247fc-bc61-4305-b8a5-19ac60eba62a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.207032 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf247fc-bc61-4305-b8a5-19ac60eba62a-kube-api-access-j9jk9" (OuterVolumeSpecName: "kube-api-access-j9jk9") pod "0cf247fc-bc61-4305-b8a5-19ac60eba62a" (UID: "0cf247fc-bc61-4305-b8a5-19ac60eba62a"). InnerVolumeSpecName "kube-api-access-j9jk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.207190 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0cf247fc-bc61-4305-b8a5-19ac60eba62a" (UID: "0cf247fc-bc61-4305-b8a5-19ac60eba62a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.229874 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-config-data" (OuterVolumeSpecName: "config-data") pod "0cf247fc-bc61-4305-b8a5-19ac60eba62a" (UID: "0cf247fc-bc61-4305-b8a5-19ac60eba62a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.231711 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cf247fc-bc61-4305-b8a5-19ac60eba62a" (UID: "0cf247fc-bc61-4305-b8a5-19ac60eba62a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.240149 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" podUID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.157:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.240521 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-9b5dc4bf7-vwl5v" podUID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.157:8080/healthcheck\": context deadline exceeded"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.266886 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0cf247fc-bc61-4305-b8a5-19ac60eba62a" (UID: "0cf247fc-bc61-4305-b8a5-19ac60eba62a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.271373 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0cf247fc-bc61-4305-b8a5-19ac60eba62a" (UID: "0cf247fc-bc61-4305-b8a5-19ac60eba62a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.303324 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.303361 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.303375 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-fernet-keys\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.303386 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-credential-keys\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.303397 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.303409 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9jk9\" (UniqueName: \"kubernetes.io/projected/0cf247fc-bc61-4305-b8a5-19ac60eba62a-kube-api-access-j9jk9\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.303422 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.303435 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf247fc-bc61-4305-b8a5-19ac60eba62a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.401650 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5775d899cd-b25ch"]
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.407555 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5775d899cd-b25ch"]
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.501101 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf247fc-bc61-4305-b8a5-19ac60eba62a" path="/var/lib/kubelet/pods/0cf247fc-bc61-4305-b8a5-19ac60eba62a/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.501951 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3119638a-6580-4a24-8e7f-40f7f7d788a5" path="/var/lib/kubelet/pods/3119638a-6580-4a24-8e7f-40f7f7d788a5/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.502639 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed0d19e-bbae-437d-9083-cded205c65f6" path="/var/lib/kubelet/pods/5ed0d19e-bbae-437d-9083-cded205c65f6/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.503589 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" path="/var/lib/kubelet/pods/6002d74d-668d-4f30-b13a-c87ec6a8a3b8/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.504126 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" path="/var/lib/kubelet/pods/709e8d49-783d-44fb-8bcb-0b4ac2199efe/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.504621 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9bf16b-039c-46ba-ae41-f0622530202d" path="/var/lib/kubelet/pods/7b9bf16b-039c-46ba-ae41-f0622530202d/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.505488 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb91013-85e0-4a13-9a06-0608b16a147b" path="/var/lib/kubelet/pods/8bb91013-85e0-4a13-9a06-0608b16a147b/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.506280 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="916727b2-6488-4edf-b33b-c5908eae0e41" path="/var/lib/kubelet/pods/916727b2-6488-4edf-b33b-c5908eae0e41/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.507852 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0" path="/var/lib/kubelet/pods/a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.508317 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea8c25c-f29f-49ba-ab27-87c8661479ab" path="/var/lib/kubelet/pods/aea8c25c-f29f-49ba-ab27-87c8661479ab/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.509002 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" path="/var/lib/kubelet/pods/aebd5213-18eb-4d84-b39e-fd22f9ff9a6c/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.510639 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b321cfd6-9039-4fe6-a39c-619f101d5e30" path="/var/lib/kubelet/pods/b321cfd6-9039-4fe6-a39c-619f101d5e30/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.511802 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba159e27-7a3b-4b90-a7db-de6135f8153c" path="/var/lib/kubelet/pods/ba159e27-7a3b-4b90-a7db-de6135f8153c/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.513295 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc331486-cb31-4169-a564-51f8527ec8dd" path="/var/lib/kubelet/pods/bc331486-cb31-4169-a564-51f8527ec8dd/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.514121 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce55d11a-887c-46e6-af05-90c3fca01e75" path="/var/lib/kubelet/pods/ce55d11a-887c-46e6-af05-90c3fca01e75/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.514926 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f7940e-dedf-45a0-97b4-dc825dc00fc5" path="/var/lib/kubelet/pods/e5f7940e-dedf-45a0-97b4-dc825dc00fc5/volumes"
Sep 30 13:58:18 crc kubenswrapper[4763]: I0930 13:58:18.994008 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.063188 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.087519 4763 generic.go:334] "Generic (PLEG): container finished" podID="aefbc43e-494e-48a6-963c-7be9d0159387" containerID="585dbe18ef01f1b67ce719782f36c12ee759638e77c5c8ef61ae81a0620d03f1" exitCode=0
Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.087636 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"aefbc43e-494e-48a6-963c-7be9d0159387","Type":"ContainerDied","Data":"585dbe18ef01f1b67ce719782f36c12ee759638e77c5c8ef61ae81a0620d03f1"}
Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.091510 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.095275 4763 generic.go:334] "Generic (PLEG): container finished" podID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerID="9f9834fea4bcdd3ebd6df5293d9dbc06c9286884eddc735ba525f57b0060ea83" exitCode=0
Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.095378 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dafb3edf-a4c0-4131-ad09-f836de63ff6b","Type":"ContainerDied","Data":"9f9834fea4bcdd3ebd6df5293d9dbc06c9286884eddc735ba525f57b0060ea83"}
Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.095633 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.097094 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.097136 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="2b7af94e-accb-45ca-af30-c489c8d77b12" containerName="galera"
Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.104192 4763 generic.go:334] "Generic (PLEG): container finished" podID="c4737b72-6133-4316-8b4e-1a7a3938cd05" containerID="41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856" exitCode=0
Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.104284 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c4737b72-6133-4316-8b4e-1a7a3938cd05","Type":"ContainerDied","Data":"41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856"}
Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.104321 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.104364 4763 scope.go:117] "RemoveContainer" containerID="41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.104349 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c4737b72-6133-4316-8b4e-1a7a3938cd05","Type":"ContainerDied","Data":"7191da17e42ff72a38a1c8d86c161c95ea806ef10e01af229fefb6d267c6ebef"} Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.108172 4763 generic.go:334] "Generic (PLEG): container finished" podID="7373f404-a756-4321-bd57-e8d60585abff" containerID="605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043" exitCode=0 Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.108236 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7373f404-a756-4321-bd57-e8d60585abff","Type":"ContainerDied","Data":"605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043"} Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.108266 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7373f404-a756-4321-bd57-e8d60585abff","Type":"ContainerDied","Data":"782e224536d49e89d38a2969163c696e56ed68e8a16ec5bb32cd6a12094b6e22"} Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.108347 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.113764 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-combined-ca-bundle\") pod \"c4737b72-6133-4316-8b4e-1a7a3938cd05\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.114032 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjwxk\" (UniqueName: \"kubernetes.io/projected/c4737b72-6133-4316-8b4e-1a7a3938cd05-kube-api-access-jjwxk\") pod \"c4737b72-6133-4316-8b4e-1a7a3938cd05\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.114159 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-config-data\") pod \"c4737b72-6133-4316-8b4e-1a7a3938cd05\" (UID: \"c4737b72-6133-4316-8b4e-1a7a3938cd05\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.115858 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.118463 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4737b72-6133-4316-8b4e-1a7a3938cd05-kube-api-access-jjwxk" (OuterVolumeSpecName: "kube-api-access-jjwxk") pod "c4737b72-6133-4316-8b4e-1a7a3938cd05" (UID: "c4737b72-6133-4316-8b4e-1a7a3938cd05"). InnerVolumeSpecName "kube-api-access-jjwxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.142834 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4737b72-6133-4316-8b4e-1a7a3938cd05" (UID: "c4737b72-6133-4316-8b4e-1a7a3938cd05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.153463 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-config-data" (OuterVolumeSpecName: "config-data") pod "c4737b72-6133-4316-8b4e-1a7a3938cd05" (UID: "c4737b72-6133-4316-8b4e-1a7a3938cd05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.154336 4763 scope.go:117] "RemoveContainer" containerID="41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856" Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.158050 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856\": container with ID starting with 41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856 not found: ID does not exist" containerID="41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.158571 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856"} err="failed to get container status \"41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856\": rpc error: code = NotFound desc = could not find container \"41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856\": container with ID starting with 41387ddc23397a6f2bc00acc6992ab544083c2735073f342808aa85567701856 not found: ID does not exist" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.158924 4763 scope.go:117] "RemoveContainer" containerID="605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.191698 4763 scope.go:117] "RemoveContainer" containerID="605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043" Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.192022 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043\": container with ID starting with 605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043 not found: ID does not exist" containerID="605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.192062 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043"} err="failed to get container status \"605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043\": rpc error: code = NotFound desc = could not find container \"605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043\": container with ID starting with 605a2a37ffdba7dc1cff6bba64dd7e6fa5fbbbe93a1e5cb699974006a719c043 
not found: ID does not exist" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.214997 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-config-data\") pod \"7373f404-a756-4321-bd57-e8d60585abff\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.215060 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-combined-ca-bundle\") pod \"7373f404-a756-4321-bd57-e8d60585abff\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.215166 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cff6\" (UniqueName: \"kubernetes.io/projected/7373f404-a756-4321-bd57-e8d60585abff-kube-api-access-4cff6\") pod \"7373f404-a756-4321-bd57-e8d60585abff\" (UID: \"7373f404-a756-4321-bd57-e8d60585abff\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.215471 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.215489 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjwxk\" (UniqueName: \"kubernetes.io/projected/c4737b72-6133-4316-8b4e-1a7a3938cd05-kube-api-access-jjwxk\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.215505 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4737b72-6133-4316-8b4e-1a7a3938cd05-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.218180 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7373f404-a756-4321-bd57-e8d60585abff-kube-api-access-4cff6" (OuterVolumeSpecName: "kube-api-access-4cff6") pod "7373f404-a756-4321-bd57-e8d60585abff" (UID: "7373f404-a756-4321-bd57-e8d60585abff"). InnerVolumeSpecName "kube-api-access-4cff6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.235782 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7373f404-a756-4321-bd57-e8d60585abff" (UID: "7373f404-a756-4321-bd57-e8d60585abff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.249538 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-config-data" (OuterVolumeSpecName: "config-data") pod "7373f404-a756-4321-bd57-e8d60585abff" (UID: "7373f404-a756-4321-bd57-e8d60585abff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.316455 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-config-data\") pod \"aefbc43e-494e-48a6-963c-7be9d0159387\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.316561 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7959k\" (UniqueName: \"kubernetes.io/projected/aefbc43e-494e-48a6-963c-7be9d0159387-kube-api-access-7959k\") pod \"aefbc43e-494e-48a6-963c-7be9d0159387\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.316636 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-combined-ca-bundle\") pod \"aefbc43e-494e-48a6-963c-7be9d0159387\" (UID: \"aefbc43e-494e-48a6-963c-7be9d0159387\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.316892 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.316908 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cff6\" (UniqueName: \"kubernetes.io/projected/7373f404-a756-4321-bd57-e8d60585abff-kube-api-access-4cff6\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.316918 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7373f404-a756-4321-bd57-e8d60585abff-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.320117 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aefbc43e-494e-48a6-963c-7be9d0159387-kube-api-access-7959k" (OuterVolumeSpecName: "kube-api-access-7959k") pod "aefbc43e-494e-48a6-963c-7be9d0159387" (UID: "aefbc43e-494e-48a6-963c-7be9d0159387"). InnerVolumeSpecName "kube-api-access-7959k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.336556 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-config-data" (OuterVolumeSpecName: "config-data") pod "aefbc43e-494e-48a6-963c-7be9d0159387" (UID: "aefbc43e-494e-48a6-963c-7be9d0159387"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.340740 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aefbc43e-494e-48a6-963c-7be9d0159387" (UID: "aefbc43e-494e-48a6-963c-7be9d0159387"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.418465 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.418507 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7959k\" (UniqueName: \"kubernetes.io/projected/aefbc43e-494e-48a6-963c-7be9d0159387-kube-api-access-7959k\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.418521 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aefbc43e-494e-48a6-963c-7be9d0159387-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.457525 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.476426 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.480359 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.495295 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.503011 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.620340 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5g7d\" (UniqueName: \"kubernetes.io/projected/dafb3edf-a4c0-4131-ad09-f836de63ff6b-kube-api-access-v5g7d\") pod \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.620782 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-log-httpd\") pod \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.620856 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-combined-ca-bundle\") pod \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.620899 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-scripts\") pod \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.620933 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-sg-core-conf-yaml\") pod \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.620983 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-config-data\") pod \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.621017 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-run-httpd\") pod \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.621086 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-ceilometer-tls-certs\") pod \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\" (UID: \"dafb3edf-a4c0-4131-ad09-f836de63ff6b\") " Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.625398 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dafb3edf-a4c0-4131-ad09-f836de63ff6b" (UID: "dafb3edf-a4c0-4131-ad09-f836de63ff6b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.625663 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-scripts" (OuterVolumeSpecName: "scripts") pod "dafb3edf-a4c0-4131-ad09-f836de63ff6b" (UID: "dafb3edf-a4c0-4131-ad09-f836de63ff6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.626024 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dafb3edf-a4c0-4131-ad09-f836de63ff6b" (UID: "dafb3edf-a4c0-4131-ad09-f836de63ff6b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.642409 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dafb3edf-a4c0-4131-ad09-f836de63ff6b-kube-api-access-v5g7d" (OuterVolumeSpecName: "kube-api-access-v5g7d") pod "dafb3edf-a4c0-4131-ad09-f836de63ff6b" (UID: "dafb3edf-a4c0-4131-ad09-f836de63ff6b"). InnerVolumeSpecName "kube-api-access-v5g7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.662581 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dafb3edf-a4c0-4131-ad09-f836de63ff6b" (UID: "dafb3edf-a4c0-4131-ad09-f836de63ff6b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.665862 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "dafb3edf-a4c0-4131-ad09-f836de63ff6b" (UID: "dafb3edf-a4c0-4131-ad09-f836de63ff6b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.686396 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dafb3edf-a4c0-4131-ad09-f836de63ff6b" (UID: "dafb3edf-a4c0-4131-ad09-f836de63ff6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.702125 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-config-data" (OuterVolumeSpecName: "config-data") pod "dafb3edf-a4c0-4131-ad09-f836de63ff6b" (UID: "dafb3edf-a4c0-4131-ad09-f836de63ff6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.726763 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.726799 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5g7d\" (UniqueName: \"kubernetes.io/projected/dafb3edf-a4c0-4131-ad09-f836de63ff6b-kube-api-access-v5g7d\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.726813 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.726822 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.726830 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.726838 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.726846 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafb3edf-a4c0-4131-ad09-f836de63ff6b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: I0930 13:58:19.726854 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dafb3edf-a4c0-4131-ad09-f836de63ff6b-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.966642 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:19 crc 
kubenswrapper[4763]: E0930 13:58:19.967036 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.967330 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.967364 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server" Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.970795 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.972084 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.973244 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:19 crc kubenswrapper[4763]: E0930 13:58:19.973281 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovs-vswitchd" Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.129924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dafb3edf-a4c0-4131-ad09-f836de63ff6b","Type":"ContainerDied","Data":"206bc2e365fe97699afc965b9cd36f62f6b07203198419a0dace4fa2fc38a5cf"} Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.129974 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.129985 4763 scope.go:117] "RemoveContainer" containerID="853ee2a8bbade3ee4f3a22fb7a290fda5a987ec05e7c37f8d30f4d981573d587" Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.140022 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"aefbc43e-494e-48a6-963c-7be9d0159387","Type":"ContainerDied","Data":"dcdc4f01401a7f30f315ce3f13aa64e2c734af8d34f6c95274ea3b5881d96cf8"} Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.140136 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.166461 4763 scope.go:117] "RemoveContainer" containerID="e4e037d3269bde82dabfcdad3ed6cb4f0b5a8c6d24b989ac4a2b7ef124409e4b" Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.166703 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.176614 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.188206 4763 scope.go:117] "RemoveContainer" containerID="9f9834fea4bcdd3ebd6df5293d9dbc06c9286884eddc735ba525f57b0060ea83" Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.202458 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.207774 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.218082 4763 scope.go:117] "RemoveContainer" containerID="4050fb5fd9697e750ea813cde28a4d185f69fe05aa260d270466a73fd43cd815" Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.238410 4763 scope.go:117] "RemoveContainer" containerID="585dbe18ef01f1b67ce719782f36c12ee759638e77c5c8ef61ae81a0620d03f1" Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.505135 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7373f404-a756-4321-bd57-e8d60585abff" path="/var/lib/kubelet/pods/7373f404-a756-4321-bd57-e8d60585abff/volumes" Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.506021 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aefbc43e-494e-48a6-963c-7be9d0159387" path="/var/lib/kubelet/pods/aefbc43e-494e-48a6-963c-7be9d0159387/volumes" Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.506713 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4737b72-6133-4316-8b4e-1a7a3938cd05" path="/var/lib/kubelet/pods/c4737b72-6133-4316-8b4e-1a7a3938cd05/volumes" Sep 30 13:58:20 crc kubenswrapper[4763]: I0930 13:58:20.508127 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" path="/var/lib/kubelet/pods/dafb3edf-a4c0-4131-ad09-f836de63ff6b/volumes" Sep 30 13:58:21 crc kubenswrapper[4763]: I0930 13:58:21.638933 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: i/o timeout" Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.755808 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.905974 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-kolla-config\") pod \"2b7af94e-accb-45ca-af30-c489c8d77b12\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.906023 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-default\") pod \"2b7af94e-accb-45ca-af30-c489c8d77b12\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.906061 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-generated\") pod \"2b7af94e-accb-45ca-af30-c489c8d77b12\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.906099 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-galera-tls-certs\") pod \"2b7af94e-accb-45ca-af30-c489c8d77b12\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.906153 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-combined-ca-bundle\") pod \"2b7af94e-accb-45ca-af30-c489c8d77b12\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.906174 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28mlb\" (UniqueName: \"kubernetes.io/projected/2b7af94e-accb-45ca-af30-c489c8d77b12-kube-api-access-28mlb\") pod \"2b7af94e-accb-45ca-af30-c489c8d77b12\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.906207 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-secrets\") pod \"2b7af94e-accb-45ca-af30-c489c8d77b12\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.906234 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-operator-scripts\") pod \"2b7af94e-accb-45ca-af30-c489c8d77b12\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.906255 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2b7af94e-accb-45ca-af30-c489c8d77b12\" (UID: \"2b7af94e-accb-45ca-af30-c489c8d77b12\") " Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.906805 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2b7af94e-accb-45ca-af30-c489c8d77b12" 
(UID: "2b7af94e-accb-45ca-af30-c489c8d77b12"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.906810 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2b7af94e-accb-45ca-af30-c489c8d77b12" (UID: "2b7af94e-accb-45ca-af30-c489c8d77b12"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.907142 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2b7af94e-accb-45ca-af30-c489c8d77b12" (UID: "2b7af94e-accb-45ca-af30-c489c8d77b12"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.907295 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b7af94e-accb-45ca-af30-c489c8d77b12" (UID: "2b7af94e-accb-45ca-af30-c489c8d77b12"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.917858 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7af94e-accb-45ca-af30-c489c8d77b12-kube-api-access-28mlb" (OuterVolumeSpecName: "kube-api-access-28mlb") pod "2b7af94e-accb-45ca-af30-c489c8d77b12" (UID: "2b7af94e-accb-45ca-af30-c489c8d77b12"). InnerVolumeSpecName "kube-api-access-28mlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.918209 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-secrets" (OuterVolumeSpecName: "secrets") pod "2b7af94e-accb-45ca-af30-c489c8d77b12" (UID: "2b7af94e-accb-45ca-af30-c489c8d77b12"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.933727 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "2b7af94e-accb-45ca-af30-c489c8d77b12" (UID: "2b7af94e-accb-45ca-af30-c489c8d77b12"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.937509 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b7af94e-accb-45ca-af30-c489c8d77b12" (UID: "2b7af94e-accb-45ca-af30-c489c8d77b12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:24 crc kubenswrapper[4763]: I0930 13:58:24.951683 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2b7af94e-accb-45ca-af30-c489c8d77b12" (UID: "2b7af94e-accb-45ca-af30-c489c8d77b12"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:24 crc kubenswrapper[4763]: E0930 13:58:24.968696 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:24 crc kubenswrapper[4763]: E0930 13:58:24.969158 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:24 crc kubenswrapper[4763]: E0930 13:58:24.969859 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:24 crc kubenswrapper[4763]: E0930 13:58:24.969943 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server" Sep 30 13:58:24 crc kubenswrapper[4763]: E0930 13:58:24.970759 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:24 crc kubenswrapper[4763]: E0930 13:58:24.973470 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:24 crc kubenswrapper[4763]: E0930 13:58:24.975970 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:24 crc kubenswrapper[4763]: E0930 13:58:24.976037 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovs-vswitchd" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.007884 4763 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-kolla-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.007923 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-default\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.007938 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b7af94e-accb-45ca-af30-c489c8d77b12-config-data-generated\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.007952 4763 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.007967 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.007979 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28mlb\" (UniqueName: \"kubernetes.io/projected/2b7af94e-accb-45ca-af30-c489c8d77b12-kube-api-access-28mlb\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.007990 4763 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2b7af94e-accb-45ca-af30-c489c8d77b12-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.008001 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b7af94e-accb-45ca-af30-c489c8d77b12-operator-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.008052 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.024776 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.109313 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.183026 4763 generic.go:334] "Generic (PLEG): container finished" podID="2b7af94e-accb-45ca-af30-c489c8d77b12" 
containerID="5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97" exitCode=0 Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.183071 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b7af94e-accb-45ca-af30-c489c8d77b12","Type":"ContainerDied","Data":"5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97"} Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.183097 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b7af94e-accb-45ca-af30-c489c8d77b12","Type":"ContainerDied","Data":"3fa32050d777708047fe02f0901ecc0ed4c915235281a6ed52411d21b1bdd265"} Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.183115 4763 scope.go:117] "RemoveContainer" containerID="5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.183242 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.220431 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.225632 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.229219 4763 scope.go:117] "RemoveContainer" containerID="19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.254939 4763 scope.go:117] "RemoveContainer" containerID="5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97" Sep 30 13:58:25 crc kubenswrapper[4763]: E0930 13:58:25.262063 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97\": container with ID starting with 5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97 not found: ID does not exist" containerID="5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.262106 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97"} err="failed to get container status \"5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97\": rpc error: code = NotFound desc = could not find container \"5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97\": container with ID starting with 5b9937ceaae3f049c574c656d5283559d77ed0993aa352ee024f31f140a70f97 not found: ID does not exist" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.262135 4763 scope.go:117] "RemoveContainer" containerID="19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399" Sep 30 13:58:25 crc kubenswrapper[4763]: E0930 13:58:25.262618 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399\": container with ID starting with 19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399 not found: ID does not exist" containerID="19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399" Sep 30 13:58:25 crc kubenswrapper[4763]: I0930 13:58:25.262653 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399"} err="failed to get container status \"19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399\": rpc error: code = NotFound desc = could not find container \"19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399\": container with ID starting with 19505c76839287c7296ce9c35292829dc63226fbb77f1085cc05a2c30bf22399 not found: ID does not exist" Sep 30 13:58:26 crc kubenswrapper[4763]: I0930 13:58:26.498152 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7af94e-accb-45ca-af30-c489c8d77b12" path="/var/lib/kubelet/pods/2b7af94e-accb-45ca-af30-c489c8d77b12/volumes" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.103287 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.211464 4763 generic.go:334] "Generic (PLEG): container finished" podID="02c33b2c-ca4f-45a8-9920-63df9fc79108" containerID="e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb" exitCode=0 Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.211512 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75bcdb8fc9-ml4n8" event={"ID":"02c33b2c-ca4f-45a8-9920-63df9fc79108","Type":"ContainerDied","Data":"e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb"} Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.211540 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75bcdb8fc9-ml4n8" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.211568 4763 scope.go:117] "RemoveContainer" containerID="d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.211555 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75bcdb8fc9-ml4n8" event={"ID":"02c33b2c-ca4f-45a8-9920-63df9fc79108","Type":"ContainerDied","Data":"04c023bec9e4143659e633d27f5d5e3b1f39ebaa173001f3ab9f9c3a93389c0d"} Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.232815 4763 scope.go:117] "RemoveContainer" containerID="e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.249232 4763 scope.go:117] "RemoveContainer" containerID="d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb" Sep 30 13:58:28 crc kubenswrapper[4763]: E0930 13:58:28.249660 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb\": container with ID starting with d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb not found: ID does not exist" containerID="d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.249785 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb"} err="failed to get container status \"d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb\": rpc error: code = NotFound desc = could not find container \"d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb\": container with ID starting with d5f28a9bce0df2a1dee078f5ea4d0fcdcda785aadbbf2181635597f8a20d03bb not found: ID does not exist" 
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.249819 4763 scope.go:117] "RemoveContainer" containerID="e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb"
Sep 30 13:58:28 crc kubenswrapper[4763]: E0930 13:58:28.250121 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb\": container with ID starting with e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb not found: ID does not exist" containerID="e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb"
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.250151 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb"} err="failed to get container status \"e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb\": rpc error: code = NotFound desc = could not find container \"e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb\": container with ID starting with e5be5e5bde09198bd85cb9b6778d75cd0583b635b30d8a5a31150bd5c45730bb not found: ID does not exist"
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.259653 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8zxh\" (UniqueName: \"kubernetes.io/projected/02c33b2c-ca4f-45a8-9920-63df9fc79108-kube-api-access-j8zxh\") pod \"02c33b2c-ca4f-45a8-9920-63df9fc79108\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") "
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.259685 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-public-tls-certs\") pod \"02c33b2c-ca4f-45a8-9920-63df9fc79108\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") "
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.259722 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-httpd-config\") pod \"02c33b2c-ca4f-45a8-9920-63df9fc79108\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") "
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.259773 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-config\") pod \"02c33b2c-ca4f-45a8-9920-63df9fc79108\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") "
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.259842 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-ovndb-tls-certs\") pod \"02c33b2c-ca4f-45a8-9920-63df9fc79108\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") "
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.259865 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-combined-ca-bundle\") pod \"02c33b2c-ca4f-45a8-9920-63df9fc79108\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") "
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.259882 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-internal-tls-certs\") pod \"02c33b2c-ca4f-45a8-9920-63df9fc79108\" (UID: \"02c33b2c-ca4f-45a8-9920-63df9fc79108\") "
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.263972 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "02c33b2c-ca4f-45a8-9920-63df9fc79108" (UID: "02c33b2c-ca4f-45a8-9920-63df9fc79108"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.264204 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02c33b2c-ca4f-45a8-9920-63df9fc79108-kube-api-access-j8zxh" (OuterVolumeSpecName: "kube-api-access-j8zxh") pod "02c33b2c-ca4f-45a8-9920-63df9fc79108" (UID: "02c33b2c-ca4f-45a8-9920-63df9fc79108"). InnerVolumeSpecName "kube-api-access-j8zxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.296445 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "02c33b2c-ca4f-45a8-9920-63df9fc79108" (UID: "02c33b2c-ca4f-45a8-9920-63df9fc79108"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.298891 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-config" (OuterVolumeSpecName: "config") pod "02c33b2c-ca4f-45a8-9920-63df9fc79108" (UID: "02c33b2c-ca4f-45a8-9920-63df9fc79108"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.306591 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "02c33b2c-ca4f-45a8-9920-63df9fc79108" (UID: "02c33b2c-ca4f-45a8-9920-63df9fc79108"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.307393 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02c33b2c-ca4f-45a8-9920-63df9fc79108" (UID: "02c33b2c-ca4f-45a8-9920-63df9fc79108"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.315217 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "02c33b2c-ca4f-45a8-9920-63df9fc79108" (UID: "02c33b2c-ca4f-45a8-9920-63df9fc79108"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.361179 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.361225 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.361239 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8zxh\" (UniqueName: \"kubernetes.io/projected/02c33b2c-ca4f-45a8-9920-63df9fc79108-kube-api-access-j8zxh\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.361254 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.361264 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.361275 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.361287 4763 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02c33b2c-ca4f-45a8-9920-63df9fc79108-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.531734 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75bcdb8fc9-ml4n8"] Sep 30 13:58:28 crc kubenswrapper[4763]: I0930 13:58:28.536991 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-75bcdb8fc9-ml4n8"] Sep 30 13:58:29 crc kubenswrapper[4763]: E0930 13:58:29.966217 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:29 crc kubenswrapper[4763]: E0930 13:58:29.967064 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:29 crc kubenswrapper[4763]: E0930 13:58:29.967480 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" 
containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 30 13:58:29 crc kubenswrapper[4763]: E0930 13:58:29.967521 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server" Sep 30 13:58:29 crc kubenswrapper[4763]: E0930 13:58:29.967819 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:29 crc kubenswrapper[4763]: E0930 13:58:29.969540 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:29 crc kubenswrapper[4763]: E0930 13:58:29.971729 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 30 13:58:29 crc kubenswrapper[4763]: E0930 13:58:29.971770 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovs-vswitchd" Sep 30 13:58:30 crc kubenswrapper[4763]: I0930 13:58:30.500541 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02c33b2c-ca4f-45a8-9920-63df9fc79108" path="/var/lib/kubelet/pods/02c33b2c-ca4f-45a8-9920-63df9fc79108/volumes" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.638526 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-26ghw"] Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.639338 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a588d68-fc19-4242-9b61-0ed79678fc9e" containerName="mariadb-account-delete" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.639406 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a588d68-fc19-4242-9b61-0ed79678fc9e" containerName="mariadb-account-delete" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.639458 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0" containerName="kube-state-metrics" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.639527 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0" containerName="kube-state-metrics" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.639578 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aea8c25c-f29f-49ba-ab27-87c8661479ab" containerName="barbican-keystone-listener" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.639647 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea8c25c-f29f-49ba-ab27-87c8661479ab" containerName="barbican-keystone-listener" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.639704 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916727b2-6488-4edf-b33b-c5908eae0e41" containerName="openstack-network-exporter" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.639778 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="916727b2-6488-4edf-b33b-c5908eae0e41" containerName="openstack-network-exporter" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.639848 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7550cde2-d6ca-4dc1-8772-5eea0a9b8142" containerName="mariadb-account-delete" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.639910 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7550cde2-d6ca-4dc1-8772-5eea0a9b8142" containerName="mariadb-account-delete" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.639991 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba159e27-7a3b-4b90-a7db-de6135f8153c" containerName="glance-log" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.640062 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba159e27-7a3b-4b90-a7db-de6135f8153c" containerName="glance-log" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.640127 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f3de76-2dd7-4d26-8010-72d5ff408190" containerName="openstack-network-exporter" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.640194 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f3de76-2dd7-4d26-8010-72d5ff408190" containerName="openstack-network-exporter" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.640266 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7af94e-accb-45ca-af30-c489c8d77b12" containerName="mysql-bootstrap" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.640326 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7af94e-accb-45ca-af30-c489c8d77b12" containerName="mysql-bootstrap" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.640398 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" containerName="ovsdbserver-nb" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.640471 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" containerName="ovsdbserver-nb" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.640547 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9f1fd7-72d5-4f11-b8c8-5e941597ca75" containerName="mariadb-account-delete" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.640623 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9f1fd7-72d5-4f11-b8c8-5e941597ca75" containerName="mariadb-account-delete" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.640678 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9bf16b-039c-46ba-ae41-f0622530202d" containerName="mariadb-account-delete" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.640741 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9bf16b-039c-46ba-ae41-f0622530202d" containerName="mariadb-account-delete" Sep 30 
13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.640837 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7af94e-accb-45ca-af30-c489c8d77b12" containerName="galera" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.640910 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7af94e-accb-45ca-af30-c489c8d77b12" containerName="galera" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.641005 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" containerName="init" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.641084 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" containerName="init" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.641156 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="ceilometer-central-agent" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.641220 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="ceilometer-central-agent" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.641296 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerName="barbican-api-log" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.641368 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerName="barbican-api-log" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.641438 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae274648-abe2-416e-a43d-edc836edc424" containerName="mariadb-account-delete" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.641488 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae274648-abe2-416e-a43d-edc836edc424" containerName="mariadb-account-delete" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.641538 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerName="placement-api" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.641586 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerName="placement-api" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.641667 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea8c25c-f29f-49ba-ab27-87c8661479ab" containerName="barbican-keystone-listener-log" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.641725 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea8c25c-f29f-49ba-ab27-87c8661479ab" containerName="barbican-keystone-listener-log" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.641781 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783d0307-40e6-4d1e-9728-b1fe356e6b52" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.641829 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="783d0307-40e6-4d1e-9728-b1fe356e6b52" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.641888 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerName="barbican-api" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.641944 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerName="barbican-api" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.642003 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c5264e-b119-4444-b954-c33b428294b5" containerName="galera" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.642068 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c5264e-b119-4444-b954-c33b428294b5" containerName="galera" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.642132 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba159e27-7a3b-4b90-a7db-de6135f8153c" containerName="glance-httpd" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.642189 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba159e27-7a3b-4b90-a7db-de6135f8153c" containerName="glance-httpd" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.642240 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7373f404-a756-4321-bd57-e8d60585abff" containerName="nova-cell0-conductor-conductor" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.642288 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7373f404-a756-4321-bd57-e8d60585abff" containerName="nova-cell0-conductor-conductor" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.642338 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerName="placement-log" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.642390 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerName="placement-log" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.642443 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerName="proxy-server" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.642490 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerName="proxy-server" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.642544 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc331486-cb31-4169-a564-51f8527ec8dd" containerName="barbican-worker" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.642609 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc331486-cb31-4169-a564-51f8527ec8dd" containerName="barbican-worker" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.642664 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="proxy-httpd" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.642713 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="proxy-httpd" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.642773 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerName="nova-metadata-metadata" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.642824 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerName="nova-metadata-metadata" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.642876 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c33b2c-ca4f-45a8-9920-63df9fc79108" containerName="neutron-api" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.642941 4763 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="02c33b2c-ca4f-45a8-9920-63df9fc79108" containerName="neutron-api" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.642991 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b321cfd6-9039-4fe6-a39c-619f101d5e30" containerName="cinder-api" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.643045 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b321cfd6-9039-4fe6-a39c-619f101d5e30" containerName="cinder-api" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.643119 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" containerName="dnsmasq-dns" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.643191 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" containerName="dnsmasq-dns" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.643259 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b611e133-1d4a-49a8-9632-bdb825d41fa4" containerName="openstack-network-exporter" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.643323 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b611e133-1d4a-49a8-9632-bdb825d41fa4" containerName="openstack-network-exporter" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.643395 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf247fc-bc61-4305-b8a5-19ac60eba62a" containerName="keystone-api" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.643457 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf247fc-bc61-4305-b8a5-19ac60eba62a" containerName="keystone-api" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.643526 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e45a139-0079-45cc-89a9-b1a0b0c1d179" containerName="mariadb-account-delete" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.643584 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e45a139-0079-45cc-89a9-b1a0b0c1d179" containerName="mariadb-account-delete" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.643674 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce55d11a-887c-46e6-af05-90c3fca01e75" containerName="glance-log" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.643734 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce55d11a-887c-46e6-af05-90c3fca01e75" containerName="glance-log" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.643800 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce55d11a-887c-46e6-af05-90c3fca01e75" containerName="glance-httpd" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.643860 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce55d11a-887c-46e6-af05-90c3fca01e75" containerName="glance-httpd" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.643936 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerName="nova-api-log" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.644000 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerName="nova-api-log" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.644071 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c5264e-b119-4444-b954-c33b428294b5" containerName="mysql-bootstrap" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.644148 4763 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e2c5264e-b119-4444-b954-c33b428294b5" containerName="mysql-bootstrap" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.644214 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" containerName="openstack-network-exporter" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.644272 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" containerName="openstack-network-exporter" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.644347 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc331486-cb31-4169-a564-51f8527ec8dd" containerName="barbican-worker-log" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.644419 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc331486-cb31-4169-a564-51f8527ec8dd" containerName="barbican-worker-log" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.644494 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb91013-85e0-4a13-9a06-0608b16a147b" containerName="probe" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.644573 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb91013-85e0-4a13-9a06-0608b16a147b" containerName="probe" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.644678 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" containerName="rabbitmq" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.644750 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" containerName="rabbitmq" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.644828 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b611e133-1d4a-49a8-9632-bdb825d41fa4" containerName="ovsdbserver-sb" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.644907 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b611e133-1d4a-49a8-9632-bdb825d41fa4" containerName="ovsdbserver-sb" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.645007 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3119638a-6580-4a24-8e7f-40f7f7d788a5" containerName="setup-container" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.645065 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3119638a-6580-4a24-8e7f-40f7f7d788a5" containerName="setup-container" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.645115 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3119638a-6580-4a24-8e7f-40f7f7d788a5" containerName="rabbitmq" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.645162 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3119638a-6580-4a24-8e7f-40f7f7d788a5" containerName="rabbitmq" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.645210 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerName="proxy-httpd" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.645274 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerName="proxy-httpd" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.645349 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerName="nova-metadata-log" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.645413 4763 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerName="nova-metadata-log" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.645471 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb91013-85e0-4a13-9a06-0608b16a147b" containerName="cinder-scheduler" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.645519 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb91013-85e0-4a13-9a06-0608b16a147b" containerName="cinder-scheduler" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.645569 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerName="nova-api-api" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.645636 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerName="nova-api-api" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.645701 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="ceilometer-notification-agent" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.645751 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="ceilometer-notification-agent" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.645810 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f7940e-dedf-45a0-97b4-dc825dc00fc5" containerName="memcached" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.645859 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f7940e-dedf-45a0-97b4-dc825dc00fc5" containerName="memcached" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.645929 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" containerName="setup-container" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.645991 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" containerName="setup-container" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.646060 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db73295-0655-443c-91e0-2cd08b119141" containerName="ovn-controller" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.646111 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db73295-0655-443c-91e0-2cd08b119141" containerName="ovn-controller" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.646199 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4737b72-6133-4316-8b4e-1a7a3938cd05" containerName="nova-scheduler-scheduler" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.646255 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4737b72-6133-4316-8b4e-1a7a3938cd05" containerName="nova-scheduler-scheduler" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.646313 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aefbc43e-494e-48a6-963c-7be9d0159387" containerName="nova-cell1-conductor-conductor" Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.646441 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefbc43e-494e-48a6-963c-7be9d0159387" containerName="nova-cell1-conductor-conductor" Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.646502 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b321cfd6-9039-4fe6-a39c-619f101d5e30" containerName="cinder-api-log" Sep 30 
Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.646642 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="sg-core"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.646701 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="sg-core"
Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.646770 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb2a96e-6374-4a22-a7fd-058bfdefac42" containerName="mariadb-account-delete"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.646820 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb2a96e-6374-4a22-a7fd-058bfdefac42" containerName="mariadb-account-delete"
Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.646878 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c33b2c-ca4f-45a8-9920-63df9fc79108" containerName="neutron-httpd"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.646941 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c33b2c-ca4f-45a8-9920-63df9fc79108" containerName="neutron-httpd"
Sep 30 13:58:31 crc kubenswrapper[4763]: E0930 13:58:31.646997 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916727b2-6488-4edf-b33b-c5908eae0e41" containerName="ovn-northd"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.647044 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="916727b2-6488-4edf-b33b-c5908eae0e41" containerName="ovn-northd"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.647354 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="916727b2-6488-4edf-b33b-c5908eae0e41" containerName="ovn-northd"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.647441 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerName="nova-metadata-metadata"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.647507 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerName="proxy-httpd"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.647580 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb91013-85e0-4a13-9a06-0608b16a147b" containerName="probe"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.647673 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b321cfd6-9039-4fe6-a39c-619f101d5e30" containerName="cinder-api-log"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.647738 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7373f404-a756-4321-bd57-e8d60585abff" containerName="nova-cell0-conductor-conductor"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.647819 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="sg-core"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.647891 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba159e27-7a3b-4b90-a7db-de6135f8153c" containerName="glance-httpd"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.647964 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerName="barbican-api"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648040 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="783d0307-40e6-4d1e-9728-b1fe356e6b52" containerName="nova-cell1-novncproxy-novncproxy"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648111 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db73295-0655-443c-91e0-2cd08b119141" containerName="ovn-controller"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648181 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b611e133-1d4a-49a8-9632-bdb825d41fa4" containerName="openstack-network-exporter"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648262 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb91013-85e0-4a13-9a06-0608b16a147b" containerName="cinder-scheduler"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648335 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4737b72-6133-4316-8b4e-1a7a3938cd05" containerName="nova-scheduler-scheduler"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648412 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b321cfd6-9039-4fe6-a39c-619f101d5e30" containerName="cinder-api"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648498 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c5264e-b119-4444-b954-c33b428294b5" containerName="galera"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648576 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7af94e-accb-45ca-af30-c489c8d77b12" containerName="galera"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648665 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aebd5213-18eb-4d84-b39e-fd22f9ff9a6c" containerName="rabbitmq"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648807 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3119638a-6580-4a24-8e7f-40f7f7d788a5" containerName="rabbitmq"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648886 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc331486-cb31-4169-a564-51f8527ec8dd" containerName="barbican-worker-log"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.648989 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="709e8d49-783d-44fb-8bcb-0b4ac2199efe" containerName="nova-metadata-log"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649061 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="916727b2-6488-4edf-b33b-c5908eae0e41" containerName="openstack-network-exporter"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649135 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59aa3ef-4b40-49ed-9c8e-4bfb0ea225a0" containerName="kube-state-metrics"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649203 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f3de76-2dd7-4d26-8010-72d5ff408190" containerName="openstack-network-exporter"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649270 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerName="nova-api-api"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649336 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9f1fd7-72d5-4f11-b8c8-5e941597ca75" containerName="mariadb-account-delete"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649400 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="ceilometer-notification-agent"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649465 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b611e133-1d4a-49a8-9632-bdb825d41fa4" containerName="ovsdbserver-sb"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649539 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7550cde2-d6ca-4dc1-8772-5eea0a9b8142" containerName="mariadb-account-delete"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649624 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="ceilometer-central-agent"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649697 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce55d11a-887c-46e6-af05-90c3fca01e75" containerName="glance-httpd"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649763 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aefbc43e-494e-48a6-963c-7be9d0159387" containerName="nova-cell1-conductor-conductor"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649893 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e45a139-0079-45cc-89a9-b1a0b0c1d179" containerName="mariadb-account-delete"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.649967 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce55d11a-887c-46e6-af05-90c3fca01e75" containerName="glance-log"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.650049 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb2a96e-6374-4a22-a7fd-058bfdefac42" containerName="mariadb-account-delete"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.650118 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc331486-cb31-4169-a564-51f8527ec8dd" containerName="barbican-worker"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.650197 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c33b2c-ca4f-45a8-9920-63df9fc79108" containerName="neutron-api"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.650269 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea8c25c-f29f-49ba-ab27-87c8661479ab" containerName="barbican-keystone-listener"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.650345 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c33b2c-ca4f-45a8-9920-63df9fc79108" containerName="neutron-httpd"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.650418 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a588d68-fc19-4242-9b61-0ed79678fc9e" containerName="mariadb-account-delete"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.650493 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerName="placement-log"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.650561 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99f7915-f0b7-498a-941d-b02d87df4b98" containerName="proxy-server"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.650757 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef00d68-6c21-4ee7-8be8-53f7c1edb2f3" containerName="dnsmasq-dns"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.650831 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9bf16b-039c-46ba-ae41-f0622530202d" containerName="mariadb-account-delete"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.650908 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae274648-abe2-416e-a43d-edc836edc424" containerName="mariadb-account-delete"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.651001 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fbd312-35ac-4e62-ad60-ffccf94eab4a" containerName="placement-api"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.651079 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed0d19e-bbae-437d-9083-cded205c65f6" containerName="barbican-api-log"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.651142 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba159e27-7a3b-4b90-a7db-de6135f8153c" containerName="glance-log"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.651204 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea8c25c-f29f-49ba-ab27-87c8661479ab" containerName="barbican-keystone-listener-log"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.651271 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" containerName="openstack-network-exporter"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.651331 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cf247fc-bc61-4305-b8a5-19ac60eba62a" containerName="keystone-api"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.651390 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6002d74d-668d-4f30-b13a-c87ec6a8a3b8" containerName="nova-api-log"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.651449 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" containerName="ovsdbserver-nb"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.651520 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f7940e-dedf-45a0-97b4-dc825dc00fc5" containerName="memcached"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.651584 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="dafb3edf-a4c0-4131-ad09-f836de63ff6b" containerName="proxy-httpd"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.652799 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-26ghw"]
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.653015 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26ghw"
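The long RemoveStaleState run above is the CPU and memory managers reacting to the first pod admission after the OpenStack teardown: before placing the new pod, they drop checkpointed per-container assignments for pods the kubelet no longer tracks. A simplified model, assuming assignments live in a flat map keyed by pod UID and container name; the real state is checkpointed to disk and tracks CPU sets rather than plain IDs.

package main

import "fmt"

type key struct{ podUID, container string }

// staleStateCleaner models the shared shape of cpu_manager and
// memory_manager state: per-container resource assignments.
type staleStateCleaner struct {
	assignments map[key][]int // container -> assigned CPU IDs
}

// removeStaleState drops assignments for pods that are no longer active,
// returning their CPUs to the shared pool.
func (s *staleStateCleaner) removeStaleState(active map[string]bool) {
	for k := range s.assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %s/%s\n", k.podUID, k.container)
			delete(s.assignments, k) // deleting during range is safe in Go
		}
	}
}

func main() {
	s := &staleStateCleaner{assignments: map[key][]int{
		{"2b7af94e-accb-45ca-af30-c489c8d77b12", "galera"}: {2, 3},
	}}
	// Only the newly admitted marketplace pod is active at this point.
	s.removeStaleState(map[string]bool{"3359bd24-0be4-4329-b72e-2a044567c7e9": true})
}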
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.815970 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb9ws\" (UniqueName: \"kubernetes.io/projected/3359bd24-0be4-4329-b72e-2a044567c7e9-kube-api-access-pb9ws\") pod \"redhat-marketplace-26ghw\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " pod="openshift-marketplace/redhat-marketplace-26ghw"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.816048 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-utilities\") pod \"redhat-marketplace-26ghw\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " pod="openshift-marketplace/redhat-marketplace-26ghw"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.816075 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-catalog-content\") pod \"redhat-marketplace-26ghw\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " pod="openshift-marketplace/redhat-marketplace-26ghw"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.917398 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-utilities\") pod \"redhat-marketplace-26ghw\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " pod="openshift-marketplace/redhat-marketplace-26ghw"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.917458 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-catalog-content\") pod \"redhat-marketplace-26ghw\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " pod="openshift-marketplace/redhat-marketplace-26ghw"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.917556 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb9ws\" (UniqueName: \"kubernetes.io/projected/3359bd24-0be4-4329-b72e-2a044567c7e9-kube-api-access-pb9ws\") pod \"redhat-marketplace-26ghw\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " pod="openshift-marketplace/redhat-marketplace-26ghw"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.918018 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-utilities\") pod \"redhat-marketplace-26ghw\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " pod="openshift-marketplace/redhat-marketplace-26ghw"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.918060 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-catalog-content\") pod \"redhat-marketplace-26ghw\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " pod="openshift-marketplace/redhat-marketplace-26ghw"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.935653 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb9ws\" (UniqueName: \"kubernetes.io/projected/3359bd24-0be4-4329-b72e-2a044567c7e9-kube-api-access-pb9ws\") pod \"redhat-marketplace-26ghw\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " pod="openshift-marketplace/redhat-marketplace-26ghw"
Sep 30 13:58:31 crc kubenswrapper[4763]: I0930 13:58:31.969206 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26ghw"
Sep 30 13:58:32 crc kubenswrapper[4763]: I0930 13:58:32.442527 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-26ghw"]
Sep 30 13:58:33 crc kubenswrapper[4763]: I0930 13:58:33.252989 4763 generic.go:334] "Generic (PLEG): container finished" podID="3359bd24-0be4-4329-b72e-2a044567c7e9" containerID="4f34ba7fbf58c60c557ae337196b5485ad8c4f77722e59b317b3b15c6e53046b" exitCode=0
Sep 30 13:58:33 crc kubenswrapper[4763]: I0930 13:58:33.253073 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26ghw" event={"ID":"3359bd24-0be4-4329-b72e-2a044567c7e9","Type":"ContainerDied","Data":"4f34ba7fbf58c60c557ae337196b5485ad8c4f77722e59b317b3b15c6e53046b"}
Sep 30 13:58:33 crc kubenswrapper[4763]: I0930 13:58:33.253253 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26ghw" event={"ID":"3359bd24-0be4-4329-b72e-2a044567c7e9","Type":"ContainerStarted","Data":"12fdea992ac6f926c820bad70343e1abe8871bc6452a34de2c68eb6ccc082921"}
Sep 30 13:58:33 crc kubenswrapper[4763]: I0930 13:58:33.255774 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 13:58:34 crc kubenswrapper[4763]: E0930 13:58:34.966815 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Sep 30 13:58:34 crc kubenswrapper[4763]: E0930 13:58:34.968308 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Sep 30 13:58:34 crc kubenswrapper[4763]: E0930 13:58:34.969047 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Sep 30 13:58:34 crc kubenswrapper[4763]: E0930 13:58:34.969535 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Sep 30 13:58:34 crc kubenswrapper[4763]: E0930 13:58:34.969577 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server"
Sep 30 13:58:34 crc kubenswrapper[4763]: E0930 13:58:34.970052 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Sep 30 13:58:34 crc kubenswrapper[4763]: E0930 13:58:34.971543 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Sep 30 13:58:34 crc kubenswrapper[4763]: E0930 13:58:34.971586 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovs-vswitchd"
Sep 30 13:58:35 crc kubenswrapper[4763]: I0930 13:58:35.272629 4763 generic.go:334] "Generic (PLEG): container finished" podID="3359bd24-0be4-4329-b72e-2a044567c7e9" containerID="5d45e172e84642e1c8c2c4b6a7ce8267c778f386d1be7f4945af15e4a935ab94" exitCode=0
Sep 30 13:58:35 crc kubenswrapper[4763]: I0930 13:58:35.272680 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26ghw" event={"ID":"3359bd24-0be4-4329-b72e-2a044567c7e9","Type":"ContainerDied","Data":"5d45e172e84642e1c8c2c4b6a7ce8267c778f386d1be7f4945af15e4a935ab94"}
Sep 30 13:58:36 crc kubenswrapper[4763]: I0930 13:58:36.284309 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26ghw" event={"ID":"3359bd24-0be4-4329-b72e-2a044567c7e9","Type":"ContainerStarted","Data":"120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45"}
Sep 30 13:58:36 crc kubenswrapper[4763]: I0930 13:58:36.311686 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-26ghw" podStartSLOduration=2.652702828 podStartE2EDuration="5.311666474s" podCreationTimestamp="2025-09-30 13:58:31 +0000 UTC" firstStartedPulling="2025-09-30 13:58:33.25546299 +0000 UTC m=+1385.394023295" lastFinishedPulling="2025-09-30 13:58:35.914426616 +0000 UTC m=+1388.052986941" observedRunningTime="2025-09-30 13:58:36.303311535 +0000 UTC m=+1388.441871820" watchObservedRunningTime="2025-09-30 13:58:36.311666474 +0000 UTC m=+1388.450226769"
Sep 30 13:58:39 crc kubenswrapper[4763]: E0930 13:58:39.967367 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Sep 30 13:58:39 crc kubenswrapper[4763]: E0930 13:58:39.969503 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Sep 30 13:58:39 crc kubenswrapper[4763]: E0930 13:58:39.969533 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Sep 30 13:58:39 crc kubenswrapper[4763]: E0930 13:58:39.970218 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Sep 30 13:58:39 crc kubenswrapper[4763]: E0930 13:58:39.970252 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server"
Sep 30 13:58:39 crc kubenswrapper[4763]: E0930 13:58:39.971020 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Sep 30 13:58:39 crc kubenswrapper[4763]: E0930 13:58:39.974839 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Sep 30 13:58:39 crc kubenswrapper[4763]: E0930 13:58:39.974880 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-72z5c" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovs-vswitchd"
Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.330124 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-72z5c_08cae05d-3853-4e7a-a66c-380c023d086b/ovs-vswitchd/0.log"
Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.332179 4763 generic.go:334] "Generic (PLEG): container finished" podID="08cae05d-3853-4e7a-a66c-380c023d086b" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" exitCode=137
Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.332309 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-72z5c" event={"ID":"08cae05d-3853-4e7a-a66c-380c023d086b","Type":"ContainerDied","Data":"0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145"}
event={"ID":"08cae05d-3853-4e7a-a66c-380c023d086b","Type":"ContainerDied","Data":"0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145"} Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.338853 4763 generic.go:334] "Generic (PLEG): container finished" podID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerID="c52de0c97e78063fc806d8831c3e0f7eba864de7670f488d153c3e4e13e7df72" exitCode=137 Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.338889 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"c52de0c97e78063fc806d8831c3e0f7eba864de7670f488d153c3e4e13e7df72"} Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.720973 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-72z5c_08cae05d-3853-4e7a-a66c-380c023d086b/ovs-vswitchd/0.log" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.721908 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.859938 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z6x5\" (UniqueName: \"kubernetes.io/projected/08cae05d-3853-4e7a-a66c-380c023d086b-kube-api-access-4z6x5\") pod \"08cae05d-3853-4e7a-a66c-380c023d086b\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860071 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-etc-ovs\") pod \"08cae05d-3853-4e7a-a66c-380c023d086b\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860101 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-log\") pod \"08cae05d-3853-4e7a-a66c-380c023d086b\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860193 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-run\") pod \"08cae05d-3853-4e7a-a66c-380c023d086b\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860232 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-lib\") pod \"08cae05d-3853-4e7a-a66c-380c023d086b\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860260 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08cae05d-3853-4e7a-a66c-380c023d086b-scripts\") pod \"08cae05d-3853-4e7a-a66c-380c023d086b\" (UID: \"08cae05d-3853-4e7a-a66c-380c023d086b\") " Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860479 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-log" (OuterVolumeSpecName: "var-log") pod "08cae05d-3853-4e7a-a66c-380c023d086b" (UID: "08cae05d-3853-4e7a-a66c-380c023d086b"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860541 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "08cae05d-3853-4e7a-a66c-380c023d086b" (UID: "08cae05d-3853-4e7a-a66c-380c023d086b"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860558 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-lib" (OuterVolumeSpecName: "var-lib") pod "08cae05d-3853-4e7a-a66c-380c023d086b" (UID: "08cae05d-3853-4e7a-a66c-380c023d086b"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860605 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-run" (OuterVolumeSpecName: "var-run") pod "08cae05d-3853-4e7a-a66c-380c023d086b" (UID: "08cae05d-3853-4e7a-a66c-380c023d086b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860656 4763 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-lib\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860674 4763 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-etc-ovs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.860686 4763 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-log\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.861970 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08cae05d-3853-4e7a-a66c-380c023d086b-scripts" (OuterVolumeSpecName: "scripts") pod "08cae05d-3853-4e7a-a66c-380c023d086b" (UID: "08cae05d-3853-4e7a-a66c-380c023d086b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.866094 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08cae05d-3853-4e7a-a66c-380c023d086b-kube-api-access-4z6x5" (OuterVolumeSpecName: "kube-api-access-4z6x5") pod "08cae05d-3853-4e7a-a66c-380c023d086b" (UID: "08cae05d-3853-4e7a-a66c-380c023d086b"). InnerVolumeSpecName "kube-api-access-4z6x5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.962924 4763 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/08cae05d-3853-4e7a-a66c-380c023d086b-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.962953 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08cae05d-3853-4e7a-a66c-380c023d086b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.962962 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z6x5\" (UniqueName: \"kubernetes.io/projected/08cae05d-3853-4e7a-a66c-380c023d086b-kube-api-access-4z6x5\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.969879 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-26ghw" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.969918 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-26ghw" Sep 30 13:58:41 crc kubenswrapper[4763]: I0930 13:58:41.994510 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.014373 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-26ghw" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.165013 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.165069 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-lock\") pod \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.165108 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rn4x\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-kube-api-access-6rn4x\") pod \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.165156 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift\") pod \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.165210 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-cache\") pod \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\" (UID: \"ba72f8d4-1822-4bb5-a099-c15d4b00b701\") " Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.166249 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-lock" (OuterVolumeSpecName: "lock") pod 
"ba72f8d4-1822-4bb5-a099-c15d4b00b701" (UID: "ba72f8d4-1822-4bb5-a099-c15d4b00b701"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.166335 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-cache" (OuterVolumeSpecName: "cache") pod "ba72f8d4-1822-4bb5-a099-c15d4b00b701" (UID: "ba72f8d4-1822-4bb5-a099-c15d4b00b701"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.168899 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "swift") pod "ba72f8d4-1822-4bb5-a099-c15d4b00b701" (UID: "ba72f8d4-1822-4bb5-a099-c15d4b00b701"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.170081 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ba72f8d4-1822-4bb5-a099-c15d4b00b701" (UID: "ba72f8d4-1822-4bb5-a099-c15d4b00b701"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.170732 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-kube-api-access-6rn4x" (OuterVolumeSpecName: "kube-api-access-6rn4x") pod "ba72f8d4-1822-4bb5-a099-c15d4b00b701" (UID: "ba72f8d4-1822-4bb5-a099-c15d4b00b701"). InnerVolumeSpecName "kube-api-access-6rn4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.266742 4763 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.266777 4763 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-cache\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.266821 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.266833 4763 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ba72f8d4-1822-4bb5-a099-c15d4b00b701-lock\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.266844 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rn4x\" (UniqueName: \"kubernetes.io/projected/ba72f8d4-1822-4bb5-a099-c15d4b00b701-kube-api-access-6rn4x\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.283091 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.349127 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-72z5c_08cae05d-3853-4e7a-a66c-380c023d086b/ovs-vswitchd/0.log" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.350637 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-72z5c" event={"ID":"08cae05d-3853-4e7a-a66c-380c023d086b","Type":"ContainerDied","Data":"a70cd39f6185e0831b184e524834c5fcd081e8ec637941345560d79748162292"} Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.350703 4763 scope.go:117] "RemoveContainer" containerID="0efe21622d68a36b254482a2fb3c37cb814c61c5f1fb34bc7a8e9badfc15f145" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.350699 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-72z5c" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.359183 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ba72f8d4-1822-4bb5-a099-c15d4b00b701","Type":"ContainerDied","Data":"8396bfe7c730495814b6c31d1b6eec95410f28177e3bd00d50d87a223d392f14"} Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.359281 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.378538 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.378879 4763 scope.go:117] "RemoveContainer" containerID="276177b934a547e5b7897e2be6cc98c0cfbf2a425bd6e5e04aeba9d2a96ae264" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.395434 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-72z5c"] Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.412856 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-26ghw" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.413773 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-72z5c"] Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.417750 4763 scope.go:117] "RemoveContainer" containerID="62bc8ec1bc27fbde74a1f9c030003027564bbbd612c1161e410a2129b5bc1b90" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.421620 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.431485 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.446817 4763 scope.go:117] "RemoveContainer" containerID="c52de0c97e78063fc806d8831c3e0f7eba864de7670f488d153c3e4e13e7df72" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.466702 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-26ghw"] Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.472419 4763 scope.go:117] "RemoveContainer" containerID="a2c552587c9daa3eff2f6b01626bdb8930edcce9d10cefa5e6f2138456bab7ae" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.491826 4763 scope.go:117] "RemoveContainer" containerID="2e13d0b0d4911e364ee6a3df6a55c9fe084a5532f8df7d0fcfa8239cfa1bd7d8" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.500973 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" path="/var/lib/kubelet/pods/08cae05d-3853-4e7a-a66c-380c023d086b/volumes" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.501629 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" path="/var/lib/kubelet/pods/ba72f8d4-1822-4bb5-a099-c15d4b00b701/volumes" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.510867 4763 scope.go:117] "RemoveContainer" containerID="68319e480d02549a9670870fb2b799e7a229e796a6e2e64c34a0f931f5c2294a" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.533971 4763 scope.go:117] "RemoveContainer" containerID="242dc53e835ef062c4e6ffb487f5cb2cd09de49af6b5aef18aae943dfe19b104" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.550176 4763 scope.go:117] "RemoveContainer" containerID="7af9dffe8b6aec08e0ffc071adb335564cdf7ec832db594c4069392c84f63460" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.566373 4763 scope.go:117] "RemoveContainer" containerID="da7282808861470139cef025a99057cbd65aa13cb0bfc0317356866852f5d03d" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.583760 4763 scope.go:117] "RemoveContainer" 
containerID="580d3253de72fffc16d0a36d6429d3d8a5a8907a3681e3d5a00508e74a43aeff" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.612642 4763 scope.go:117] "RemoveContainer" containerID="86fa7c116448649efc303d132f55a3b4d51ce4ff7728e8cb83a546ae8cc6be04" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.627879 4763 scope.go:117] "RemoveContainer" containerID="17bbda96e72abf0e4fc5b512a5a8c030ec54be5d2e9697b12e425013fc6e5674" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.644467 4763 scope.go:117] "RemoveContainer" containerID="e59cd93a64db4c4a2e52fd8dec840f2e642f02a5898d661bf7aeab73f09ef3a3" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.660585 4763 scope.go:117] "RemoveContainer" containerID="6913f1a8c201da716c04a9052d361c35e1f0beafd7a800065007dd41db8b8e2f" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.678864 4763 scope.go:117] "RemoveContainer" containerID="1bb4e132326be55cfb6d2c02cfd640df1ebca518cc39286f9fe76a41c347dda4" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.697788 4763 scope.go:117] "RemoveContainer" containerID="18e1cb42d1ac47e256e5579a10290ba641e04de49adb3a3798799607a90f1b1a" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.717306 4763 scope.go:117] "RemoveContainer" containerID="dfe4428a4ee91686c8b839dc094b2cea3d884fe055392d209003e50ad9cecb05" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.812456 4763 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podf3cc8ad7-1903-4a9f-94a4-a84f47cd1189"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podf3cc8ad7-1903-4a9f-94a4-a84f47cd1189] : Timed out while waiting for systemd to remove kubepods-besteffort-podf3cc8ad7_1903_4a9f_94a4_a84f47cd1189.slice" Sep 30 13:58:42 crc kubenswrapper[4763]: E0930 13:58:42.812519 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podf3cc8ad7-1903-4a9f-94a4-a84f47cd1189] : unable to destroy cgroup paths for cgroup [kubepods besteffort podf3cc8ad7-1903-4a9f-94a4-a84f47cd1189] : Timed out while waiting for systemd to remove kubepods-besteffort-podf3cc8ad7_1903_4a9f_94a4_a84f47cd1189.slice" pod="openstack/ovsdbserver-nb-0" podUID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.819766 4763 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod98e98c9d-b727-4c5b-857b-13064b0ef92f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod98e98c9d-b727-4c5b-857b-13064b0ef92f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod98e98c9d_b727_4c5b_857b_13064b0ef92f.slice" Sep 30 13:58:42 crc kubenswrapper[4763]: E0930 13:58:42.819792 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod98e98c9d-b727-4c5b-857b-13064b0ef92f] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod98e98c9d-b727-4c5b-857b-13064b0ef92f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod98e98c9d_b727_4c5b_857b_13064b0ef92f.slice" pod="openstack/openstackclient" podUID="98e98c9d-b727-4c5b-857b-13064b0ef92f" Sep 30 13:58:42 crc kubenswrapper[4763]: I0930 13:58:42.968940 4763 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod03f3de76-2dd7-4d26-8010-72d5ff408190"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort 
pod03f3de76-2dd7-4d26-8010-72d5ff408190] : Timed out while waiting for systemd to remove kubepods-besteffort-pod03f3de76_2dd7_4d26_8010_72d5ff408190.slice" Sep 30 13:58:42 crc kubenswrapper[4763]: E0930 13:58:42.969201 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod03f3de76-2dd7-4d26-8010-72d5ff408190] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod03f3de76-2dd7-4d26-8010-72d5ff408190] : Timed out while waiting for systemd to remove kubepods-besteffort-pod03f3de76_2dd7_4d26_8010_72d5ff408190.slice" pod="openstack/ovn-controller-metrics-djfwj" podUID="03f3de76-2dd7-4d26-8010-72d5ff408190" Sep 30 13:58:43 crc kubenswrapper[4763]: I0930 13:58:43.373663 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-djfwj" Sep 30 13:58:43 crc kubenswrapper[4763]: I0930 13:58:43.374019 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 13:58:43 crc kubenswrapper[4763]: I0930 13:58:43.375647 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 13:58:43 crc kubenswrapper[4763]: I0930 13:58:43.410715 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-djfwj"] Sep 30 13:58:43 crc kubenswrapper[4763]: I0930 13:58:43.414406 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-djfwj"] Sep 30 13:58:43 crc kubenswrapper[4763]: I0930 13:58:43.439773 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 13:58:43 crc kubenswrapper[4763]: I0930 13:58:43.446945 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 13:58:44 crc kubenswrapper[4763]: I0930 13:58:44.380283 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-26ghw" podUID="3359bd24-0be4-4329-b72e-2a044567c7e9" containerName="registry-server" containerID="cri-o://120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45" gracePeriod=2 Sep 30 13:58:44 crc kubenswrapper[4763]: I0930 13:58:44.501976 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f3de76-2dd7-4d26-8010-72d5ff408190" path="/var/lib/kubelet/pods/03f3de76-2dd7-4d26-8010-72d5ff408190/volumes" Sep 30 13:58:44 crc kubenswrapper[4763]: I0930 13:58:44.503053 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3cc8ad7-1903-4a9f-94a4-a84f47cd1189" path="/var/lib/kubelet/pods/f3cc8ad7-1903-4a9f-94a4-a84f47cd1189/volumes" Sep 30 13:58:44 crc kubenswrapper[4763]: I0930 13:58:44.901710 4763 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podcf9f1fd7-72d5-4f11-b8c8-5e941597ca75"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podcf9f1fd7-72d5-4f11-b8c8-5e941597ca75] : Timed out while waiting for systemd to remove kubepods-besteffort-podcf9f1fd7_72d5_4f11_b8c8_5e941597ca75.slice" Sep 30 13:58:44 crc kubenswrapper[4763]: E0930 13:58:44.901789 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podcf9f1fd7-72d5-4f11-b8c8-5e941597ca75] : unable to destroy cgroup paths for cgroup [kubepods besteffort podcf9f1fd7-72d5-4f11-b8c8-5e941597ca75] : Timed out while waiting for systemd to remove 
kubepods-besteffort-podcf9f1fd7_72d5_4f11_b8c8_5e941597ca75.slice" pod="openstack/glance9e56-account-delete-4pzvl" podUID="cf9f1fd7-72d5-4f11-b8c8-5e941597ca75" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.347455 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26ghw" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.392086 4763 generic.go:334] "Generic (PLEG): container finished" podID="3359bd24-0be4-4329-b72e-2a044567c7e9" containerID="120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45" exitCode=0 Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.392143 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26ghw" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.392161 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance9e56-account-delete-4pzvl" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.392154 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26ghw" event={"ID":"3359bd24-0be4-4329-b72e-2a044567c7e9","Type":"ContainerDied","Data":"120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45"} Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.392203 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26ghw" event={"ID":"3359bd24-0be4-4329-b72e-2a044567c7e9","Type":"ContainerDied","Data":"12fdea992ac6f926c820bad70343e1abe8871bc6452a34de2c68eb6ccc082921"} Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.392226 4763 scope.go:117] "RemoveContainer" containerID="120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.412925 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance9e56-account-delete-4pzvl"] Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.414282 4763 scope.go:117] "RemoveContainer" containerID="5d45e172e84642e1c8c2c4b6a7ce8267c778f386d1be7f4945af15e4a935ab94" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.417990 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance9e56-account-delete-4pzvl"] Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.452908 4763 scope.go:117] "RemoveContainer" containerID="4f34ba7fbf58c60c557ae337196b5485ad8c4f77722e59b317b3b15c6e53046b" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.470437 4763 scope.go:117] "RemoveContainer" containerID="120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45" Sep 30 13:58:45 crc kubenswrapper[4763]: E0930 13:58:45.471023 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45\": container with ID starting with 120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45 not found: ID does not exist" containerID="120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.471088 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45"} err="failed to get container status \"120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45\": rpc error: code = NotFound desc = could not find 
container \"120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45\": container with ID starting with 120595c8202f3a1727d9f6eaeb5ed0db881463626387ccdc84e26dcd3bfc7a45 not found: ID does not exist" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.471120 4763 scope.go:117] "RemoveContainer" containerID="5d45e172e84642e1c8c2c4b6a7ce8267c778f386d1be7f4945af15e4a935ab94" Sep 30 13:58:45 crc kubenswrapper[4763]: E0930 13:58:45.471457 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d45e172e84642e1c8c2c4b6a7ce8267c778f386d1be7f4945af15e4a935ab94\": container with ID starting with 5d45e172e84642e1c8c2c4b6a7ce8267c778f386d1be7f4945af15e4a935ab94 not found: ID does not exist" containerID="5d45e172e84642e1c8c2c4b6a7ce8267c778f386d1be7f4945af15e4a935ab94" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.471491 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d45e172e84642e1c8c2c4b6a7ce8267c778f386d1be7f4945af15e4a935ab94"} err="failed to get container status \"5d45e172e84642e1c8c2c4b6a7ce8267c778f386d1be7f4945af15e4a935ab94\": rpc error: code = NotFound desc = could not find container \"5d45e172e84642e1c8c2c4b6a7ce8267c778f386d1be7f4945af15e4a935ab94\": container with ID starting with 5d45e172e84642e1c8c2c4b6a7ce8267c778f386d1be7f4945af15e4a935ab94 not found: ID does not exist" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.471511 4763 scope.go:117] "RemoveContainer" containerID="4f34ba7fbf58c60c557ae337196b5485ad8c4f77722e59b317b3b15c6e53046b" Sep 30 13:58:45 crc kubenswrapper[4763]: E0930 13:58:45.471747 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f34ba7fbf58c60c557ae337196b5485ad8c4f77722e59b317b3b15c6e53046b\": container with ID starting with 4f34ba7fbf58c60c557ae337196b5485ad8c4f77722e59b317b3b15c6e53046b not found: ID does not exist" containerID="4f34ba7fbf58c60c557ae337196b5485ad8c4f77722e59b317b3b15c6e53046b" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.471773 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f34ba7fbf58c60c557ae337196b5485ad8c4f77722e59b317b3b15c6e53046b"} err="failed to get container status \"4f34ba7fbf58c60c557ae337196b5485ad8c4f77722e59b317b3b15c6e53046b\": rpc error: code = NotFound desc = could not find container \"4f34ba7fbf58c60c557ae337196b5485ad8c4f77722e59b317b3b15c6e53046b\": container with ID starting with 4f34ba7fbf58c60c557ae337196b5485ad8c4f77722e59b317b3b15c6e53046b not found: ID does not exist" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.522769 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb9ws\" (UniqueName: \"kubernetes.io/projected/3359bd24-0be4-4329-b72e-2a044567c7e9-kube-api-access-pb9ws\") pod \"3359bd24-0be4-4329-b72e-2a044567c7e9\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.522838 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-catalog-content\") pod \"3359bd24-0be4-4329-b72e-2a044567c7e9\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.522937 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-utilities\") pod \"3359bd24-0be4-4329-b72e-2a044567c7e9\" (UID: \"3359bd24-0be4-4329-b72e-2a044567c7e9\") " Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.524111 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-utilities" (OuterVolumeSpecName: "utilities") pod "3359bd24-0be4-4329-b72e-2a044567c7e9" (UID: "3359bd24-0be4-4329-b72e-2a044567c7e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.527944 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3359bd24-0be4-4329-b72e-2a044567c7e9-kube-api-access-pb9ws" (OuterVolumeSpecName: "kube-api-access-pb9ws") pod "3359bd24-0be4-4329-b72e-2a044567c7e9" (UID: "3359bd24-0be4-4329-b72e-2a044567c7e9"). InnerVolumeSpecName "kube-api-access-pb9ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.543590 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3359bd24-0be4-4329-b72e-2a044567c7e9" (UID: "3359bd24-0be4-4329-b72e-2a044567c7e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.624666 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.625366 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3359bd24-0be4-4329-b72e-2a044567c7e9-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.625387 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb9ws\" (UniqueName: \"kubernetes.io/projected/3359bd24-0be4-4329-b72e-2a044567c7e9-kube-api-access-pb9ws\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.732230 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-26ghw"] Sep 30 13:58:45 crc kubenswrapper[4763]: I0930 13:58:45.739872 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-26ghw"] Sep 30 13:58:46 crc kubenswrapper[4763]: I0930 13:58:46.499960 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3359bd24-0be4-4329-b72e-2a044567c7e9" path="/var/lib/kubelet/pods/3359bd24-0be4-4329-b72e-2a044567c7e9/volumes" Sep 30 13:58:46 crc kubenswrapper[4763]: I0930 13:58:46.500808 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf9f1fd7-72d5-4f11-b8c8-5e941597ca75" path="/var/lib/kubelet/pods/cf9f1fd7-72d5-4f11-b8c8-5e941597ca75/volumes" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.766268 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gx2hg"] Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767136 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" 
containerName="container-replicator" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767156 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-replicator" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767180 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-replicator" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767187 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-replicator" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767200 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-expirer" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767209 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-expirer" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767223 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-replicator" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767230 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-replicator" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767245 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-server" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767253 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-server" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767268 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-updater" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767274 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-updater" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767288 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-auditor" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767297 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-auditor" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767311 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="rsync" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767319 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="rsync" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767335 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-server" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767343 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-server" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767354 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-updater" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767361 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-updater" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767376 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="swift-recon-cron" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767383 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="swift-recon-cron" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767397 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-auditor" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767404 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-auditor" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767419 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3359bd24-0be4-4329-b72e-2a044567c7e9" containerName="extract-utilities" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767426 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3359bd24-0be4-4329-b72e-2a044567c7e9" containerName="extract-utilities" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767436 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-server" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767444 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-server" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767457 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-reaper" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767464 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-reaper" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767476 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3359bd24-0be4-4329-b72e-2a044567c7e9" containerName="extract-content" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767485 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3359bd24-0be4-4329-b72e-2a044567c7e9" containerName="extract-content" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767495 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3359bd24-0be4-4329-b72e-2a044567c7e9" containerName="registry-server" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767502 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3359bd24-0be4-4329-b72e-2a044567c7e9" containerName="registry-server" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767513 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-auditor" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767520 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-auditor" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767529 4763 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767536 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767549 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server-init" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767556 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server-init" Sep 30 13:59:19 crc kubenswrapper[4763]: E0930 13:59:19.767567 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovs-vswitchd" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767575 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovs-vswitchd" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767768 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-server" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767784 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-expirer" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767799 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-replicator" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767811 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-replicator" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767825 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3359bd24-0be4-4329-b72e-2a044567c7e9" containerName="registry-server" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767841 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovsdb-server" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767853 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-server" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767865 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="rsync" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767876 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-updater" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767890 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-replicator" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767902 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-server" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767911 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="swift-recon-cron" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 
13:59:19.767920 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="08cae05d-3853-4e7a-a66c-380c023d086b" containerName="ovs-vswitchd" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767928 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-auditor" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767942 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="container-updater" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767951 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="object-auditor" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767961 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-auditor" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.767973 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba72f8d4-1822-4bb5-a099-c15d4b00b701" containerName="account-reaper" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.769061 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.778984 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gx2hg"] Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.859903 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wmpj\" (UniqueName: \"kubernetes.io/projected/56812ab5-144f-44de-9a79-b11208d66273-kube-api-access-9wmpj\") pod \"certified-operators-gx2hg\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.860319 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-catalog-content\") pod \"certified-operators-gx2hg\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.860374 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-utilities\") pod \"certified-operators-gx2hg\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.961805 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-utilities\") pod \"certified-operators-gx2hg\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.961970 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmpj\" (UniqueName: \"kubernetes.io/projected/56812ab5-144f-44de-9a79-b11208d66273-kube-api-access-9wmpj\") pod \"certified-operators-gx2hg\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " 
pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.962042 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-catalog-content\") pod \"certified-operators-gx2hg\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.962487 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-utilities\") pod \"certified-operators-gx2hg\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.962826 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-catalog-content\") pod \"certified-operators-gx2hg\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:19 crc kubenswrapper[4763]: I0930 13:59:19.994195 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wmpj\" (UniqueName: \"kubernetes.io/projected/56812ab5-144f-44de-9a79-b11208d66273-kube-api-access-9wmpj\") pod \"certified-operators-gx2hg\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:20 crc kubenswrapper[4763]: I0930 13:59:20.094018 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:20 crc kubenswrapper[4763]: I0930 13:59:20.587187 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gx2hg"] Sep 30 13:59:20 crc kubenswrapper[4763]: I0930 13:59:20.713879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gx2hg" event={"ID":"56812ab5-144f-44de-9a79-b11208d66273","Type":"ContainerStarted","Data":"ed584fbb7fbb2ed6173693e85d1a3e3353e0c9ac05339279d7c40661c7ad29a0"} Sep 30 13:59:21 crc kubenswrapper[4763]: I0930 13:59:21.723114 4763 generic.go:334] "Generic (PLEG): container finished" podID="56812ab5-144f-44de-9a79-b11208d66273" containerID="2ba70449ab4cac88ef2ee00c2196341af363b193f19f4ab42d209168194bea14" exitCode=0 Sep 30 13:59:21 crc kubenswrapper[4763]: I0930 13:59:21.723195 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gx2hg" event={"ID":"56812ab5-144f-44de-9a79-b11208d66273","Type":"ContainerDied","Data":"2ba70449ab4cac88ef2ee00c2196341af363b193f19f4ab42d209168194bea14"} Sep 30 13:59:22 crc kubenswrapper[4763]: I0930 13:59:22.734618 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gx2hg" event={"ID":"56812ab5-144f-44de-9a79-b11208d66273","Type":"ContainerStarted","Data":"24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e"} Sep 30 13:59:23 crc kubenswrapper[4763]: I0930 13:59:23.744142 4763 generic.go:334] "Generic (PLEG): container finished" podID="56812ab5-144f-44de-9a79-b11208d66273" containerID="24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e" exitCode=0 Sep 30 13:59:23 crc kubenswrapper[4763]: I0930 13:59:23.744199 4763 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gx2hg" event={"ID":"56812ab5-144f-44de-9a79-b11208d66273","Type":"ContainerDied","Data":"24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e"} Sep 30 13:59:24 crc kubenswrapper[4763]: I0930 13:59:24.756055 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gx2hg" event={"ID":"56812ab5-144f-44de-9a79-b11208d66273","Type":"ContainerStarted","Data":"c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51"} Sep 30 13:59:24 crc kubenswrapper[4763]: I0930 13:59:24.775134 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gx2hg" podStartSLOduration=3.252684727 podStartE2EDuration="5.775112661s" podCreationTimestamp="2025-09-30 13:59:19 +0000 UTC" firstStartedPulling="2025-09-30 13:59:21.726044965 +0000 UTC m=+1433.864605270" lastFinishedPulling="2025-09-30 13:59:24.248472909 +0000 UTC m=+1436.387033204" observedRunningTime="2025-09-30 13:59:24.772488135 +0000 UTC m=+1436.911048430" watchObservedRunningTime="2025-09-30 13:59:24.775112661 +0000 UTC m=+1436.913672946" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.094658 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.094972 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.139978 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.546808 4763 scope.go:117] "RemoveContainer" containerID="1e04a4e066c062d270cdfe94173bdc4d2b4a9790730eae432b09da3c6571223b" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.581443 4763 scope.go:117] "RemoveContainer" containerID="f6e4f42f53a2bc5ce14b714651684ec859fa6c8501a49e587b1165761bb42457" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.629340 4763 scope.go:117] "RemoveContainer" containerID="a4f61c64a8df3d9915add4b261e934b36d4aae625742c1aa68894904c7c207d4" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.652823 4763 scope.go:117] "RemoveContainer" containerID="f7ddd2b0fb1d49dfc38face49b6897b54965d13f08f04b6ddbd116bb6356c07b" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.669583 4763 scope.go:117] "RemoveContainer" containerID="0d0112a1787094253153b3a60f8663407e6cb545baa23acb0b8b21ec9335b321" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.722380 4763 scope.go:117] "RemoveContainer" containerID="ed2ff9ff9884e55f31010027f8a058be4e0e73d12107dca3cb5d5fa374a21077" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.831730 4763 scope.go:117] "RemoveContainer" containerID="c361eaf5f095ffaa66bf4d8a6a114f837e677ebb597ac00b0ddebf3497844457" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.852026 4763 scope.go:117] "RemoveContainer" containerID="8d9e3ab86dc859f16e88025097f97a98ba29d69374fe2446837e45205f560afd" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.858545 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.875959 4763 scope.go:117] "RemoveContainer" 
containerID="a6ea38cc2fc406d3ccc0ad4f22050e6ab3828a4bcfdcdf48337ef33e08b35c48" Sep 30 13:59:30 crc kubenswrapper[4763]: I0930 13:59:30.928286 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gx2hg"] Sep 30 13:59:32 crc kubenswrapper[4763]: I0930 13:59:32.829572 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gx2hg" podUID="56812ab5-144f-44de-9a79-b11208d66273" containerName="registry-server" containerID="cri-o://c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51" gracePeriod=2 Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.103187 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8s6xt"] Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.105204 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.122356 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s6xt"] Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.241299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-catalog-content\") pod \"community-operators-8s6xt\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.241360 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnkgt\" (UniqueName: \"kubernetes.io/projected/e86304fb-a02e-4650-b7cf-ab001ea66437-kube-api-access-lnkgt\") pod \"community-operators-8s6xt\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.241382 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-utilities\") pod \"community-operators-8s6xt\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.342474 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-catalog-content\") pod \"community-operators-8s6xt\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.342535 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnkgt\" (UniqueName: \"kubernetes.io/projected/e86304fb-a02e-4650-b7cf-ab001ea66437-kube-api-access-lnkgt\") pod \"community-operators-8s6xt\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.342555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-utilities\") pod \"community-operators-8s6xt\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " 
pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.343068 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-catalog-content\") pod \"community-operators-8s6xt\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.343111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-utilities\") pod \"community-operators-8s6xt\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.361074 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnkgt\" (UniqueName: \"kubernetes.io/projected/e86304fb-a02e-4650-b7cf-ab001ea66437-kube-api-access-lnkgt\") pod \"community-operators-8s6xt\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.440822 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.793683 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.840586 4763 generic.go:334] "Generic (PLEG): container finished" podID="56812ab5-144f-44de-9a79-b11208d66273" containerID="c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51" exitCode=0 Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.840657 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gx2hg" event={"ID":"56812ab5-144f-44de-9a79-b11208d66273","Type":"ContainerDied","Data":"c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51"} Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.840696 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gx2hg" event={"ID":"56812ab5-144f-44de-9a79-b11208d66273","Type":"ContainerDied","Data":"ed584fbb7fbb2ed6173693e85d1a3e3353e0c9ac05339279d7c40661c7ad29a0"} Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.840722 4763 scope.go:117] "RemoveContainer" containerID="c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.840787 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gx2hg" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.868513 4763 scope.go:117] "RemoveContainer" containerID="24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.886655 4763 scope.go:117] "RemoveContainer" containerID="2ba70449ab4cac88ef2ee00c2196341af363b193f19f4ab42d209168194bea14" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.902252 4763 scope.go:117] "RemoveContainer" containerID="c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51" Sep 30 13:59:33 crc kubenswrapper[4763]: E0930 13:59:33.902695 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51\": container with ID starting with c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51 not found: ID does not exist" containerID="c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.902737 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51"} err="failed to get container status \"c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51\": rpc error: code = NotFound desc = could not find container \"c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51\": container with ID starting with c1053d9fc1080f456293ba799212ebc4457fbe8f04852c3e5914dad375531d51 not found: ID does not exist" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.902765 4763 scope.go:117] "RemoveContainer" containerID="24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e" Sep 30 13:59:33 crc kubenswrapper[4763]: E0930 13:59:33.903039 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e\": container with ID starting with 24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e not found: ID does not exist" containerID="24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.903063 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e"} err="failed to get container status \"24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e\": rpc error: code = NotFound desc = could not find container \"24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e\": container with ID starting with 24d7496e571be5ea04465f8e679b373796f0529a88665d50447afb74d4e5da6e not found: ID does not exist" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.903077 4763 scope.go:117] "RemoveContainer" containerID="2ba70449ab4cac88ef2ee00c2196341af363b193f19f4ab42d209168194bea14" Sep 30 13:59:33 crc kubenswrapper[4763]: E0930 13:59:33.903348 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba70449ab4cac88ef2ee00c2196341af363b193f19f4ab42d209168194bea14\": container with ID starting with 2ba70449ab4cac88ef2ee00c2196341af363b193f19f4ab42d209168194bea14 not found: ID does not exist" containerID="2ba70449ab4cac88ef2ee00c2196341af363b193f19f4ab42d209168194bea14" 
Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.903384 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba70449ab4cac88ef2ee00c2196341af363b193f19f4ab42d209168194bea14"} err="failed to get container status \"2ba70449ab4cac88ef2ee00c2196341af363b193f19f4ab42d209168194bea14\": rpc error: code = NotFound desc = could not find container \"2ba70449ab4cac88ef2ee00c2196341af363b193f19f4ab42d209168194bea14\": container with ID starting with 2ba70449ab4cac88ef2ee00c2196341af363b193f19f4ab42d209168194bea14 not found: ID does not exist" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.950183 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-utilities\") pod \"56812ab5-144f-44de-9a79-b11208d66273\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.950241 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wmpj\" (UniqueName: \"kubernetes.io/projected/56812ab5-144f-44de-9a79-b11208d66273-kube-api-access-9wmpj\") pod \"56812ab5-144f-44de-9a79-b11208d66273\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.950347 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-catalog-content\") pod \"56812ab5-144f-44de-9a79-b11208d66273\" (UID: \"56812ab5-144f-44de-9a79-b11208d66273\") " Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.952446 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-utilities" (OuterVolumeSpecName: "utilities") pod "56812ab5-144f-44de-9a79-b11208d66273" (UID: "56812ab5-144f-44de-9a79-b11208d66273"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.959398 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56812ab5-144f-44de-9a79-b11208d66273-kube-api-access-9wmpj" (OuterVolumeSpecName: "kube-api-access-9wmpj") pod "56812ab5-144f-44de-9a79-b11208d66273" (UID: "56812ab5-144f-44de-9a79-b11208d66273"). InnerVolumeSpecName "kube-api-access-9wmpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.995009 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s6xt"] Sep 30 13:59:33 crc kubenswrapper[4763]: I0930 13:59:33.998813 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56812ab5-144f-44de-9a79-b11208d66273" (UID: "56812ab5-144f-44de-9a79-b11208d66273"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:59:34 crc kubenswrapper[4763]: I0930 13:59:34.052287 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:34 crc kubenswrapper[4763]: I0930 13:59:34.052334 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56812ab5-144f-44de-9a79-b11208d66273-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:34 crc kubenswrapper[4763]: I0930 13:59:34.052347 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wmpj\" (UniqueName: \"kubernetes.io/projected/56812ab5-144f-44de-9a79-b11208d66273-kube-api-access-9wmpj\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:34 crc kubenswrapper[4763]: I0930 13:59:34.175666 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gx2hg"] Sep 30 13:59:34 crc kubenswrapper[4763]: I0930 13:59:34.181265 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gx2hg"] Sep 30 13:59:34 crc kubenswrapper[4763]: I0930 13:59:34.500781 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56812ab5-144f-44de-9a79-b11208d66273" path="/var/lib/kubelet/pods/56812ab5-144f-44de-9a79-b11208d66273/volumes" Sep 30 13:59:34 crc kubenswrapper[4763]: I0930 13:59:34.849423 4763 generic.go:334] "Generic (PLEG): container finished" podID="e86304fb-a02e-4650-b7cf-ab001ea66437" containerID="fa43c2236affda82cec145f38c3e46d4c3f36a6cbbb6250a2a3ad65c6abfeb3b" exitCode=0 Sep 30 13:59:34 crc kubenswrapper[4763]: I0930 13:59:34.849492 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s6xt" event={"ID":"e86304fb-a02e-4650-b7cf-ab001ea66437","Type":"ContainerDied","Data":"fa43c2236affda82cec145f38c3e46d4c3f36a6cbbb6250a2a3ad65c6abfeb3b"} Sep 30 13:59:34 crc kubenswrapper[4763]: I0930 13:59:34.851410 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s6xt" event={"ID":"e86304fb-a02e-4650-b7cf-ab001ea66437","Type":"ContainerStarted","Data":"085c4260bfc2f4264262193906b7bc9f63f5c1474653768bf88f8a8a62857383"} Sep 30 13:59:36 crc kubenswrapper[4763]: I0930 13:59:36.059744 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:59:36 crc kubenswrapper[4763]: I0930 13:59:36.060046 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:59:36 crc kubenswrapper[4763]: I0930 13:59:36.870195 4763 generic.go:334] "Generic (PLEG): container finished" podID="e86304fb-a02e-4650-b7cf-ab001ea66437" containerID="64c1d93b0d1201441b73b8375e6f40f369a17161c641c558e121170b48f2aff9" exitCode=0 Sep 30 13:59:36 crc kubenswrapper[4763]: I0930 13:59:36.870259 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s6xt" 
event={"ID":"e86304fb-a02e-4650-b7cf-ab001ea66437","Type":"ContainerDied","Data":"64c1d93b0d1201441b73b8375e6f40f369a17161c641c558e121170b48f2aff9"} Sep 30 13:59:37 crc kubenswrapper[4763]: I0930 13:59:37.883562 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s6xt" event={"ID":"e86304fb-a02e-4650-b7cf-ab001ea66437","Type":"ContainerStarted","Data":"5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42"} Sep 30 13:59:37 crc kubenswrapper[4763]: I0930 13:59:37.900491 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8s6xt" podStartSLOduration=2.250004889 podStartE2EDuration="4.900470961s" podCreationTimestamp="2025-09-30 13:59:33 +0000 UTC" firstStartedPulling="2025-09-30 13:59:34.850704839 +0000 UTC m=+1446.989265114" lastFinishedPulling="2025-09-30 13:59:37.501170891 +0000 UTC m=+1449.639731186" observedRunningTime="2025-09-30 13:59:37.900144283 +0000 UTC m=+1450.038704588" watchObservedRunningTime="2025-09-30 13:59:37.900470961 +0000 UTC m=+1450.039031246" Sep 30 13:59:43 crc kubenswrapper[4763]: I0930 13:59:43.441958 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:43 crc kubenswrapper[4763]: I0930 13:59:43.442487 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:43 crc kubenswrapper[4763]: I0930 13:59:43.486223 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:43 crc kubenswrapper[4763]: I0930 13:59:43.968232 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:44 crc kubenswrapper[4763]: I0930 13:59:44.020570 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8s6xt"] Sep 30 13:59:45 crc kubenswrapper[4763]: I0930 13:59:45.947380 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8s6xt" podUID="e86304fb-a02e-4650-b7cf-ab001ea66437" containerName="registry-server" containerID="cri-o://5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42" gracePeriod=2 Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.867783 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.941845 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnkgt\" (UniqueName: \"kubernetes.io/projected/e86304fb-a02e-4650-b7cf-ab001ea66437-kube-api-access-lnkgt\") pod \"e86304fb-a02e-4650-b7cf-ab001ea66437\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.941937 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-utilities\") pod \"e86304fb-a02e-4650-b7cf-ab001ea66437\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.942031 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-catalog-content\") pod \"e86304fb-a02e-4650-b7cf-ab001ea66437\" (UID: \"e86304fb-a02e-4650-b7cf-ab001ea66437\") " Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.943093 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-utilities" (OuterVolumeSpecName: "utilities") pod "e86304fb-a02e-4650-b7cf-ab001ea66437" (UID: "e86304fb-a02e-4650-b7cf-ab001ea66437"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.948500 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86304fb-a02e-4650-b7cf-ab001ea66437-kube-api-access-lnkgt" (OuterVolumeSpecName: "kube-api-access-lnkgt") pod "e86304fb-a02e-4650-b7cf-ab001ea66437" (UID: "e86304fb-a02e-4650-b7cf-ab001ea66437"). InnerVolumeSpecName "kube-api-access-lnkgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.959236 4763 generic.go:334] "Generic (PLEG): container finished" podID="e86304fb-a02e-4650-b7cf-ab001ea66437" containerID="5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42" exitCode=0 Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.959434 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s6xt" event={"ID":"e86304fb-a02e-4650-b7cf-ab001ea66437","Type":"ContainerDied","Data":"5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42"} Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.959684 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s6xt" event={"ID":"e86304fb-a02e-4650-b7cf-ab001ea66437","Type":"ContainerDied","Data":"085c4260bfc2f4264262193906b7bc9f63f5c1474653768bf88f8a8a62857383"} Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.959509 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8s6xt" Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.959815 4763 scope.go:117] "RemoveContainer" containerID="5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42" Sep 30 13:59:46 crc kubenswrapper[4763]: I0930 13:59:46.983318 4763 scope.go:117] "RemoveContainer" containerID="64c1d93b0d1201441b73b8375e6f40f369a17161c641c558e121170b48f2aff9" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.001515 4763 scope.go:117] "RemoveContainer" containerID="fa43c2236affda82cec145f38c3e46d4c3f36a6cbbb6250a2a3ad65c6abfeb3b" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.032727 4763 scope.go:117] "RemoveContainer" containerID="5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42" Sep 30 13:59:47 crc kubenswrapper[4763]: E0930 13:59:47.033314 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42\": container with ID starting with 5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42 not found: ID does not exist" containerID="5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.033366 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42"} err="failed to get container status \"5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42\": rpc error: code = NotFound desc = could not find container \"5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42\": container with ID starting with 5e68f89fb381774393e8957a4b5b84587b9df256226e6b4a6de4dd4f736bad42 not found: ID does not exist" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.033392 4763 scope.go:117] "RemoveContainer" containerID="64c1d93b0d1201441b73b8375e6f40f369a17161c641c558e121170b48f2aff9" Sep 30 13:59:47 crc kubenswrapper[4763]: E0930 13:59:47.033970 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c1d93b0d1201441b73b8375e6f40f369a17161c641c558e121170b48f2aff9\": container with ID starting with 64c1d93b0d1201441b73b8375e6f40f369a17161c641c558e121170b48f2aff9 not found: ID does not exist" containerID="64c1d93b0d1201441b73b8375e6f40f369a17161c641c558e121170b48f2aff9" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.034024 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c1d93b0d1201441b73b8375e6f40f369a17161c641c558e121170b48f2aff9"} err="failed to get container status \"64c1d93b0d1201441b73b8375e6f40f369a17161c641c558e121170b48f2aff9\": rpc error: code = NotFound desc = could not find container \"64c1d93b0d1201441b73b8375e6f40f369a17161c641c558e121170b48f2aff9\": container with ID starting with 64c1d93b0d1201441b73b8375e6f40f369a17161c641c558e121170b48f2aff9 not found: ID does not exist" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.034054 4763 scope.go:117] "RemoveContainer" containerID="fa43c2236affda82cec145f38c3e46d4c3f36a6cbbb6250a2a3ad65c6abfeb3b" Sep 30 13:59:47 crc kubenswrapper[4763]: E0930 13:59:47.034330 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa43c2236affda82cec145f38c3e46d4c3f36a6cbbb6250a2a3ad65c6abfeb3b\": container with ID starting 
with fa43c2236affda82cec145f38c3e46d4c3f36a6cbbb6250a2a3ad65c6abfeb3b not found: ID does not exist" containerID="fa43c2236affda82cec145f38c3e46d4c3f36a6cbbb6250a2a3ad65c6abfeb3b" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.034357 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa43c2236affda82cec145f38c3e46d4c3f36a6cbbb6250a2a3ad65c6abfeb3b"} err="failed to get container status \"fa43c2236affda82cec145f38c3e46d4c3f36a6cbbb6250a2a3ad65c6abfeb3b\": rpc error: code = NotFound desc = could not find container \"fa43c2236affda82cec145f38c3e46d4c3f36a6cbbb6250a2a3ad65c6abfeb3b\": container with ID starting with fa43c2236affda82cec145f38c3e46d4c3f36a6cbbb6250a2a3ad65c6abfeb3b not found: ID does not exist" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.043496 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnkgt\" (UniqueName: \"kubernetes.io/projected/e86304fb-a02e-4650-b7cf-ab001ea66437-kube-api-access-lnkgt\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.043535 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.239363 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e86304fb-a02e-4650-b7cf-ab001ea66437" (UID: "e86304fb-a02e-4650-b7cf-ab001ea66437"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.245529 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86304fb-a02e-4650-b7cf-ab001ea66437-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.295900 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8s6xt"] Sep 30 13:59:47 crc kubenswrapper[4763]: I0930 13:59:47.309626 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8s6xt"] Sep 30 13:59:48 crc kubenswrapper[4763]: I0930 13:59:48.503075 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86304fb-a02e-4650-b7cf-ab001ea66437" path="/var/lib/kubelet/pods/e86304fb-a02e-4650-b7cf-ab001ea66437/volumes" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.157450 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv"] Sep 30 14:00:00 crc kubenswrapper[4763]: E0930 14:00:00.158371 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56812ab5-144f-44de-9a79-b11208d66273" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.158388 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="56812ab5-144f-44de-9a79-b11208d66273" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4763]: E0930 14:00:00.158414 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86304fb-a02e-4650-b7cf-ab001ea66437" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.158421 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e86304fb-a02e-4650-b7cf-ab001ea66437" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4763]: E0930 14:00:00.158436 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86304fb-a02e-4650-b7cf-ab001ea66437" containerName="extract-utilities" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.158445 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86304fb-a02e-4650-b7cf-ab001ea66437" containerName="extract-utilities" Sep 30 14:00:00 crc kubenswrapper[4763]: E0930 14:00:00.158457 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56812ab5-144f-44de-9a79-b11208d66273" containerName="extract-utilities" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.158464 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="56812ab5-144f-44de-9a79-b11208d66273" containerName="extract-utilities" Sep 30 14:00:00 crc kubenswrapper[4763]: E0930 14:00:00.158487 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56812ab5-144f-44de-9a79-b11208d66273" containerName="extract-content" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.158495 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="56812ab5-144f-44de-9a79-b11208d66273" containerName="extract-content" Sep 30 14:00:00 crc kubenswrapper[4763]: E0930 14:00:00.158514 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86304fb-a02e-4650-b7cf-ab001ea66437" containerName="extract-content" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.158521 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86304fb-a02e-4650-b7cf-ab001ea66437" containerName="extract-content" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.158712 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="56812ab5-144f-44de-9a79-b11208d66273" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.158731 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86304fb-a02e-4650-b7cf-ab001ea66437" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.159363 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.162440 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.162660 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.167424 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv"] Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.232460 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-config-volume\") pod \"collect-profiles-29320680-6pfcv\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.232581 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-secret-volume\") pod \"collect-profiles-29320680-6pfcv\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.232643 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7wpl\" (UniqueName: \"kubernetes.io/projected/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-kube-api-access-r7wpl\") pod \"collect-profiles-29320680-6pfcv\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.333652 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-config-volume\") pod \"collect-profiles-29320680-6pfcv\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.333978 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-secret-volume\") pod \"collect-profiles-29320680-6pfcv\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.334072 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7wpl\" (UniqueName: \"kubernetes.io/projected/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-kube-api-access-r7wpl\") pod \"collect-profiles-29320680-6pfcv\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.334555 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-config-volume\") pod 
\"collect-profiles-29320680-6pfcv\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.343179 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-secret-volume\") pod \"collect-profiles-29320680-6pfcv\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.350798 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7wpl\" (UniqueName: \"kubernetes.io/projected/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-kube-api-access-r7wpl\") pod \"collect-profiles-29320680-6pfcv\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.495568 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:00 crc kubenswrapper[4763]: I0930 14:00:00.928548 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv"] Sep 30 14:00:01 crc kubenswrapper[4763]: I0930 14:00:01.089552 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" event={"ID":"2317d90b-e233-4f27-b3bc-f60b3aaea8ea","Type":"ContainerStarted","Data":"c148373cf0ba27282bfb3ea709f44cdae71768ad8f8f6c2ba06fa88bb834621f"} Sep 30 14:00:02 crc kubenswrapper[4763]: I0930 14:00:02.098286 4763 generic.go:334] "Generic (PLEG): container finished" podID="2317d90b-e233-4f27-b3bc-f60b3aaea8ea" containerID="84edac1b690c46b42d3b51685afd16e7387e24b36a53232269097482ee924af2" exitCode=0 Sep 30 14:00:02 crc kubenswrapper[4763]: I0930 14:00:02.098331 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" event={"ID":"2317d90b-e233-4f27-b3bc-f60b3aaea8ea","Type":"ContainerDied","Data":"84edac1b690c46b42d3b51685afd16e7387e24b36a53232269097482ee924af2"} Sep 30 14:00:03 crc kubenswrapper[4763]: I0930 14:00:03.365509 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:03 crc kubenswrapper[4763]: I0930 14:00:03.483688 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-secret-volume\") pod \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " Sep 30 14:00:03 crc kubenswrapper[4763]: I0930 14:00:03.483843 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7wpl\" (UniqueName: \"kubernetes.io/projected/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-kube-api-access-r7wpl\") pod \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " Sep 30 14:00:03 crc kubenswrapper[4763]: I0930 14:00:03.483871 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-config-volume\") pod \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\" (UID: \"2317d90b-e233-4f27-b3bc-f60b3aaea8ea\") " Sep 30 14:00:03 crc kubenswrapper[4763]: I0930 14:00:03.484513 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-config-volume" (OuterVolumeSpecName: "config-volume") pod "2317d90b-e233-4f27-b3bc-f60b3aaea8ea" (UID: "2317d90b-e233-4f27-b3bc-f60b3aaea8ea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:00:03 crc kubenswrapper[4763]: I0930 14:00:03.489874 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-kube-api-access-r7wpl" (OuterVolumeSpecName: "kube-api-access-r7wpl") pod "2317d90b-e233-4f27-b3bc-f60b3aaea8ea" (UID: "2317d90b-e233-4f27-b3bc-f60b3aaea8ea"). InnerVolumeSpecName "kube-api-access-r7wpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:03 crc kubenswrapper[4763]: I0930 14:00:03.490475 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2317d90b-e233-4f27-b3bc-f60b3aaea8ea" (UID: "2317d90b-e233-4f27-b3bc-f60b3aaea8ea"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:03 crc kubenswrapper[4763]: I0930 14:00:03.584953 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:03 crc kubenswrapper[4763]: I0930 14:00:03.584990 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7wpl\" (UniqueName: \"kubernetes.io/projected/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-kube-api-access-r7wpl\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:03 crc kubenswrapper[4763]: I0930 14:00:03.585001 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2317d90b-e233-4f27-b3bc-f60b3aaea8ea-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:04 crc kubenswrapper[4763]: I0930 14:00:04.116907 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" event={"ID":"2317d90b-e233-4f27-b3bc-f60b3aaea8ea","Type":"ContainerDied","Data":"c148373cf0ba27282bfb3ea709f44cdae71768ad8f8f6c2ba06fa88bb834621f"} Sep 30 14:00:04 crc kubenswrapper[4763]: I0930 14:00:04.116946 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv" Sep 30 14:00:04 crc kubenswrapper[4763]: I0930 14:00:04.116964 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c148373cf0ba27282bfb3ea709f44cdae71768ad8f8f6c2ba06fa88bb834621f" Sep 30 14:00:06 crc kubenswrapper[4763]: I0930 14:00:06.060433 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:00:06 crc kubenswrapper[4763]: I0930 14:00:06.060798 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.088913 4763 scope.go:117] "RemoveContainer" containerID="ed161b44111ccbcbdc21277ce31dee7a2c0f706c282e5185255018b02b3a8ce8" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.117142 4763 scope.go:117] "RemoveContainer" containerID="0bd45db8372dd13c7f8b6af5a8f009d51c1365ea9b4c411b536ad6bd5ed5a9de" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.157781 4763 scope.go:117] "RemoveContainer" containerID="b8ccf04d26a147ca6c7ea6f6c38cf982b002fc378fcf0c8ce0053f5a185448b5" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.171491 4763 scope.go:117] "RemoveContainer" containerID="5529dafb50ba26bbc36ba32edb787859ab716d780082a8b3e8be2e416c1a2e80" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.209521 4763 scope.go:117] "RemoveContainer" containerID="482012e4eb14524c218a4c31ad892d834468e3f45f8db3a12de6d337c52a9f32" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.233401 4763 scope.go:117] "RemoveContainer" containerID="5e829dbda104a31e4cd527c5f5bbba0452beccd647814b138c2474881de0df51" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.280536 4763 scope.go:117] 
"RemoveContainer" containerID="869a8cdf28fb93309ac2159cf23a98dda8d452f10c75857366f1baeb925873e1" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.305274 4763 scope.go:117] "RemoveContainer" containerID="fd79bb983a126745103e9f7ac0ec7a6f63a3419ae638e5a9063e7fdc0f374214" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.335762 4763 scope.go:117] "RemoveContainer" containerID="170b6b6c50e40279bb612fcdfa9b24046c60eb591063778e915b4267e9aacc17" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.353783 4763 scope.go:117] "RemoveContainer" containerID="2988167e296d4244bebecb8ac04deccc2b6566a6c43096907b36078148436d85" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.376010 4763 scope.go:117] "RemoveContainer" containerID="ab5560070012ef542fb548a412733c5149604a4b26330c36843bead4e81247e0" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.393781 4763 scope.go:117] "RemoveContainer" containerID="e2e055db9fefff8aff0fd24fed8b68b4440bb95925f9d67dbbb38fcff4a3852f" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.412307 4763 scope.go:117] "RemoveContainer" containerID="d5c0dbee3becae192bb8e52217ee73cb5863f82428aefff98191c654a4fd0735" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.442851 4763 scope.go:117] "RemoveContainer" containerID="f1e0864f632bb09307785cf5234b3bb987c83f38ba4012bfdeaef8121a56adfe" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.463819 4763 scope.go:117] "RemoveContainer" containerID="6d69d05182fb0eb1fcd6f0067fd4d61b3fcc60f4c6a09fab437953f0bbf152dc" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.485155 4763 scope.go:117] "RemoveContainer" containerID="29a057f7e3e5df0f8f3a921b57ba3864cd15c6c5a6e6543c79fa63c986c34576" Sep 30 14:00:31 crc kubenswrapper[4763]: I0930 14:00:31.503819 4763 scope.go:117] "RemoveContainer" containerID="72f6ad706fff18c900a95959f4f354c1438166c8053468b7bedfc284a8853597" Sep 30 14:00:36 crc kubenswrapper[4763]: I0930 14:00:36.060166 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:00:36 crc kubenswrapper[4763]: I0930 14:00:36.060875 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:00:36 crc kubenswrapper[4763]: I0930 14:00:36.060945 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 14:00:36 crc kubenswrapper[4763]: I0930 14:00:36.061585 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:00:36 crc kubenswrapper[4763]: I0930 14:00:36.061675 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" 
containerName="machine-config-daemon" containerID="cri-o://557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" gracePeriod=600 Sep 30 14:00:36 crc kubenswrapper[4763]: E0930 14:00:36.194716 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:00:36 crc kubenswrapper[4763]: I0930 14:00:36.413404 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" exitCode=0 Sep 30 14:00:36 crc kubenswrapper[4763]: I0930 14:00:36.413468 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3"} Sep 30 14:00:36 crc kubenswrapper[4763]: I0930 14:00:36.413528 4763 scope.go:117] "RemoveContainer" containerID="93b97d46ec993310482c9f94e284fd8475a6addbce7a122971ed13904ff04071" Sep 30 14:00:36 crc kubenswrapper[4763]: I0930 14:00:36.414345 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:00:36 crc kubenswrapper[4763]: E0930 14:00:36.414669 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:00:48 crc kubenswrapper[4763]: I0930 14:00:48.495570 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:00:48 crc kubenswrapper[4763]: E0930 14:00:48.496641 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:00:59 crc kubenswrapper[4763]: I0930 14:00:59.489719 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:00:59 crc kubenswrapper[4763]: E0930 14:00:59.490653 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:01:14 crc kubenswrapper[4763]: I0930 14:01:14.489512 4763 scope.go:117] "RemoveContainer" 
containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:01:14 crc kubenswrapper[4763]: E0930 14:01:14.490319 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:01:29 crc kubenswrapper[4763]: I0930 14:01:29.489287 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:01:29 crc kubenswrapper[4763]: E0930 14:01:29.490183 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:01:31 crc kubenswrapper[4763]: I0930 14:01:31.760882 4763 scope.go:117] "RemoveContainer" containerID="c8f854a8e0e8f8b63357c20a3ee69e40c128f3f024eaa531bfc9fe89a8b73296" Sep 30 14:01:31 crc kubenswrapper[4763]: I0930 14:01:31.797654 4763 scope.go:117] "RemoveContainer" containerID="1e418fd879a76bc974ec3882d16798e171a7acc9a5c7ba9107b332d8d6aea0fc" Sep 30 14:01:31 crc kubenswrapper[4763]: I0930 14:01:31.847484 4763 scope.go:117] "RemoveContainer" containerID="20139e3c78bc0baaef93ec4b062142a66e7644706689edf7d66fcdcdfa1ee54b" Sep 30 14:01:31 crc kubenswrapper[4763]: I0930 14:01:31.879441 4763 scope.go:117] "RemoveContainer" containerID="3e0c5a3566149a9091d7d20437254c474eb77aa49ad67fd687b671660064adfb" Sep 30 14:01:31 crc kubenswrapper[4763]: I0930 14:01:31.899267 4763 scope.go:117] "RemoveContainer" containerID="6f12ce438cdfca7bff7ec6b8d59f8bef94cce949cecfbd974e69e68743678f6d" Sep 30 14:01:31 crc kubenswrapper[4763]: I0930 14:01:31.917807 4763 scope.go:117] "RemoveContainer" containerID="6c1683e0a49795ca53dea060320152270eba1724911d0166bb8dcca337c5c33b" Sep 30 14:01:31 crc kubenswrapper[4763]: I0930 14:01:31.943178 4763 scope.go:117] "RemoveContainer" containerID="42b30ec43f1257d28794be7be6660214b1f78e8dcc9ff724d26c8c28a27d8b51" Sep 30 14:01:31 crc kubenswrapper[4763]: I0930 14:01:31.970203 4763 scope.go:117] "RemoveContainer" containerID="a33f1d59438472e65bd0b823acb92857152c410b2221a830aec663d9df6536d7" Sep 30 14:01:31 crc kubenswrapper[4763]: I0930 14:01:31.988572 4763 scope.go:117] "RemoveContainer" containerID="8a1c727d333559a452f33984696e78504154274594d0d689186dfd04e4589f8b" Sep 30 14:01:32 crc kubenswrapper[4763]: I0930 14:01:32.013685 4763 scope.go:117] "RemoveContainer" containerID="075a46a3aaa42f1f7b706ec297479cc87c77334287a9c8d5cabb52aed65fd99f" Sep 30 14:01:32 crc kubenswrapper[4763]: I0930 14:01:32.027819 4763 scope.go:117] "RemoveContainer" containerID="81acae4ba8a1fe31f7b7ec84384f6c7903c26616e109d6de747404a115029a84" Sep 30 14:01:32 crc kubenswrapper[4763]: I0930 14:01:32.052132 4763 scope.go:117] "RemoveContainer" containerID="4774b3f6ac7b157b234c17b092bac7e0ad012e21d52b9da28446843481c35238" Sep 30 14:01:32 crc kubenswrapper[4763]: I0930 14:01:32.071269 4763 scope.go:117] "RemoveContainer" 
containerID="b854e79c47c6f6e756da69e22feb04c256f5f94040230f706fcdea5ac2ac8dc0" Sep 30 14:01:40 crc kubenswrapper[4763]: I0930 14:01:40.490193 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:01:40 crc kubenswrapper[4763]: E0930 14:01:40.491098 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:01:51 crc kubenswrapper[4763]: I0930 14:01:51.490410 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:01:51 crc kubenswrapper[4763]: E0930 14:01:51.491158 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:02:05 crc kubenswrapper[4763]: I0930 14:02:05.489704 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:02:05 crc kubenswrapper[4763]: E0930 14:02:05.490552 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.758749 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j8gvf"] Sep 30 14:02:09 crc kubenswrapper[4763]: E0930 14:02:09.759464 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2317d90b-e233-4f27-b3bc-f60b3aaea8ea" containerName="collect-profiles" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.759478 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2317d90b-e233-4f27-b3bc-f60b3aaea8ea" containerName="collect-profiles" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.759628 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2317d90b-e233-4f27-b3bc-f60b3aaea8ea" containerName="collect-profiles" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.760650 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.766394 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j8gvf"] Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.890179 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klq5x\" (UniqueName: \"kubernetes.io/projected/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-kube-api-access-klq5x\") pod \"redhat-operators-j8gvf\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.890565 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-utilities\") pod \"redhat-operators-j8gvf\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.890719 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-catalog-content\") pod \"redhat-operators-j8gvf\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.991513 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-utilities\") pod \"redhat-operators-j8gvf\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.991572 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-catalog-content\") pod \"redhat-operators-j8gvf\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.991653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klq5x\" (UniqueName: \"kubernetes.io/projected/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-kube-api-access-klq5x\") pod \"redhat-operators-j8gvf\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.992020 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-utilities\") pod \"redhat-operators-j8gvf\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:09 crc kubenswrapper[4763]: I0930 14:02:09.992267 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-catalog-content\") pod \"redhat-operators-j8gvf\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:10 crc kubenswrapper[4763]: I0930 14:02:10.014988 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-klq5x\" (UniqueName: \"kubernetes.io/projected/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-kube-api-access-klq5x\") pod \"redhat-operators-j8gvf\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:10 crc kubenswrapper[4763]: I0930 14:02:10.101575 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:10 crc kubenswrapper[4763]: I0930 14:02:10.340209 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j8gvf"] Sep 30 14:02:11 crc kubenswrapper[4763]: I0930 14:02:11.239968 4763 generic.go:334] "Generic (PLEG): container finished" podID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" containerID="d01d9b6f5e4ba0df3222968a4c1c9e9b7bb57ee8d7577d14aa05a6ad7b08d27f" exitCode=0 Sep 30 14:02:11 crc kubenswrapper[4763]: I0930 14:02:11.240140 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8gvf" event={"ID":"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc","Type":"ContainerDied","Data":"d01d9b6f5e4ba0df3222968a4c1c9e9b7bb57ee8d7577d14aa05a6ad7b08d27f"} Sep 30 14:02:11 crc kubenswrapper[4763]: I0930 14:02:11.240537 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8gvf" event={"ID":"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc","Type":"ContainerStarted","Data":"34bb654820ae29d969aa0522633b1d4c40e0f57e1829b48a39afc909c2310a27"} Sep 30 14:02:13 crc kubenswrapper[4763]: I0930 14:02:13.257183 4763 generic.go:334] "Generic (PLEG): container finished" podID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" containerID="a4c92344b5d6d214e2f10d4fb8cd737a2372caaf3570e524b0c79788f685b4ae" exitCode=0 Sep 30 14:02:13 crc kubenswrapper[4763]: I0930 14:02:13.257245 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8gvf" event={"ID":"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc","Type":"ContainerDied","Data":"a4c92344b5d6d214e2f10d4fb8cd737a2372caaf3570e524b0c79788f685b4ae"} Sep 30 14:02:14 crc kubenswrapper[4763]: I0930 14:02:14.266853 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8gvf" event={"ID":"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc","Type":"ContainerStarted","Data":"8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903"} Sep 30 14:02:14 crc kubenswrapper[4763]: I0930 14:02:14.285111 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j8gvf" podStartSLOduration=2.471515853 podStartE2EDuration="5.285094178s" podCreationTimestamp="2025-09-30 14:02:09 +0000 UTC" firstStartedPulling="2025-09-30 14:02:11.241863727 +0000 UTC m=+1603.380424022" lastFinishedPulling="2025-09-30 14:02:14.055442062 +0000 UTC m=+1606.194002347" observedRunningTime="2025-09-30 14:02:14.283318824 +0000 UTC m=+1606.421879109" watchObservedRunningTime="2025-09-30 14:02:14.285094178 +0000 UTC m=+1606.423654463" Sep 30 14:02:17 crc kubenswrapper[4763]: I0930 14:02:17.489818 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:02:17 crc kubenswrapper[4763]: E0930 14:02:17.490295 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:02:20 crc kubenswrapper[4763]: I0930 14:02:20.102255 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:20 crc kubenswrapper[4763]: I0930 14:02:20.102755 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:20 crc kubenswrapper[4763]: I0930 14:02:20.183648 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:20 crc kubenswrapper[4763]: I0930 14:02:20.393861 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:20 crc kubenswrapper[4763]: I0930 14:02:20.454525 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j8gvf"] Sep 30 14:02:22 crc kubenswrapper[4763]: I0930 14:02:22.339834 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j8gvf" podUID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" containerName="registry-server" containerID="cri-o://8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903" gracePeriod=2 Sep 30 14:02:22 crc kubenswrapper[4763]: I0930 14:02:22.737569 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:22 crc kubenswrapper[4763]: I0930 14:02:22.907946 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-utilities\") pod \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " Sep 30 14:02:22 crc kubenswrapper[4763]: I0930 14:02:22.908079 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-catalog-content\") pod \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " Sep 30 14:02:22 crc kubenswrapper[4763]: I0930 14:02:22.908129 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klq5x\" (UniqueName: \"kubernetes.io/projected/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-kube-api-access-klq5x\") pod \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\" (UID: \"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc\") " Sep 30 14:02:22 crc kubenswrapper[4763]: I0930 14:02:22.908888 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-utilities" (OuterVolumeSpecName: "utilities") pod "a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" (UID: "a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:22 crc kubenswrapper[4763]: I0930 14:02:22.913347 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-kube-api-access-klq5x" (OuterVolumeSpecName: "kube-api-access-klq5x") pod "a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" (UID: "a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc"). InnerVolumeSpecName "kube-api-access-klq5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.004218 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" (UID: "a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.009690 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.009723 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.009738 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klq5x\" (UniqueName: \"kubernetes.io/projected/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc-kube-api-access-klq5x\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.349525 4763 generic.go:334] "Generic (PLEG): container finished" podID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" containerID="8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903" exitCode=0 Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.349640 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j8gvf" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.349647 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8gvf" event={"ID":"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc","Type":"ContainerDied","Data":"8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903"} Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.350719 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8gvf" event={"ID":"a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc","Type":"ContainerDied","Data":"34bb654820ae29d969aa0522633b1d4c40e0f57e1829b48a39afc909c2310a27"} Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.350813 4763 scope.go:117] "RemoveContainer" containerID="8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.386994 4763 scope.go:117] "RemoveContainer" containerID="a4c92344b5d6d214e2f10d4fb8cd737a2372caaf3570e524b0c79788f685b4ae" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.394762 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j8gvf"] Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.399666 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j8gvf"] Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.411573 4763 scope.go:117] "RemoveContainer" containerID="d01d9b6f5e4ba0df3222968a4c1c9e9b7bb57ee8d7577d14aa05a6ad7b08d27f" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.450369 4763 scope.go:117] "RemoveContainer" containerID="8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903" Sep 30 14:02:23 crc kubenswrapper[4763]: E0930 14:02:23.450788 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903\": container with ID starting with 8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903 not found: ID does not exist" containerID="8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.450819 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903"} err="failed to get container status \"8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903\": rpc error: code = NotFound desc = could not find container \"8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903\": container with ID starting with 8efc8b82d988e16a8439a02e74c54125fd1229832c57aa80734e8e9f2fe4f903 not found: ID does not exist" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.450844 4763 scope.go:117] "RemoveContainer" containerID="a4c92344b5d6d214e2f10d4fb8cd737a2372caaf3570e524b0c79788f685b4ae" Sep 30 14:02:23 crc kubenswrapper[4763]: E0930 14:02:23.451141 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4c92344b5d6d214e2f10d4fb8cd737a2372caaf3570e524b0c79788f685b4ae\": container with ID starting with a4c92344b5d6d214e2f10d4fb8cd737a2372caaf3570e524b0c79788f685b4ae not found: ID does not exist" containerID="a4c92344b5d6d214e2f10d4fb8cd737a2372caaf3570e524b0c79788f685b4ae" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.451172 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c92344b5d6d214e2f10d4fb8cd737a2372caaf3570e524b0c79788f685b4ae"} err="failed to get container status \"a4c92344b5d6d214e2f10d4fb8cd737a2372caaf3570e524b0c79788f685b4ae\": rpc error: code = NotFound desc = could not find container \"a4c92344b5d6d214e2f10d4fb8cd737a2372caaf3570e524b0c79788f685b4ae\": container with ID starting with a4c92344b5d6d214e2f10d4fb8cd737a2372caaf3570e524b0c79788f685b4ae not found: ID does not exist" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.451190 4763 scope.go:117] "RemoveContainer" containerID="d01d9b6f5e4ba0df3222968a4c1c9e9b7bb57ee8d7577d14aa05a6ad7b08d27f" Sep 30 14:02:23 crc kubenswrapper[4763]: E0930 14:02:23.451422 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01d9b6f5e4ba0df3222968a4c1c9e9b7bb57ee8d7577d14aa05a6ad7b08d27f\": container with ID starting with d01d9b6f5e4ba0df3222968a4c1c9e9b7bb57ee8d7577d14aa05a6ad7b08d27f not found: ID does not exist" containerID="d01d9b6f5e4ba0df3222968a4c1c9e9b7bb57ee8d7577d14aa05a6ad7b08d27f" Sep 30 14:02:23 crc kubenswrapper[4763]: I0930 14:02:23.451452 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01d9b6f5e4ba0df3222968a4c1c9e9b7bb57ee8d7577d14aa05a6ad7b08d27f"} err="failed to get container status \"d01d9b6f5e4ba0df3222968a4c1c9e9b7bb57ee8d7577d14aa05a6ad7b08d27f\": rpc error: code = NotFound desc = could not find container \"d01d9b6f5e4ba0df3222968a4c1c9e9b7bb57ee8d7577d14aa05a6ad7b08d27f\": container with ID starting with d01d9b6f5e4ba0df3222968a4c1c9e9b7bb57ee8d7577d14aa05a6ad7b08d27f not found: ID does not exist" Sep 30 14:02:24 crc kubenswrapper[4763]: I0930 14:02:24.503577 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" path="/var/lib/kubelet/pods/a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc/volumes" Sep 30 14:02:32 crc kubenswrapper[4763]: I0930 14:02:32.190432 4763 scope.go:117] "RemoveContainer" containerID="88ae4d3dde2710a1efacdf602a8c10d81c64e05ba43cf75a14a42b318a53d4fe" Sep 30 14:02:32 crc kubenswrapper[4763]: I0930 14:02:32.211090 4763 scope.go:117] "RemoveContainer" containerID="d294dc3212ae359102de89c32bffc910fda43a40e264c95e34fc8674cd634dfc" Sep 30 14:02:32 crc kubenswrapper[4763]: I0930 14:02:32.230021 4763 scope.go:117] "RemoveContainer" containerID="a1f8a7e885307c1270b15dcfc52ae38978cfd08ad2beb3c528ebc1aba347578d" Sep 30 14:02:32 crc kubenswrapper[4763]: I0930 14:02:32.268095 4763 scope.go:117] "RemoveContainer" containerID="9f293e39c482f59dc3fed18cd673e0ab7b3a1b249fcc80c88a3973a815f36bca" Sep 30 14:02:32 crc kubenswrapper[4763]: I0930 14:02:32.307761 4763 scope.go:117] "RemoveContainer" containerID="8227bad438fba7b4abeec80dc38b2de97381f23f132108eaf0c0183b892f50cb" Sep 30 14:02:32 crc kubenswrapper[4763]: I0930 14:02:32.330713 4763 scope.go:117] "RemoveContainer" containerID="91399e58f45ab5b2a692a69ce7f59195c5667685f1af70d93034455a54845843" Sep 30 14:02:32 crc kubenswrapper[4763]: I0930 14:02:32.357376 4763 scope.go:117] "RemoveContainer" containerID="272acd02c53a37dfb9729b5a518d4b01ce95a61821dc87ad8baac1d2c9db8bfa" Sep 30 14:02:32 crc kubenswrapper[4763]: I0930 14:02:32.383372 4763 scope.go:117] "RemoveContainer" containerID="a7d873e31826e92d279b98b286bec38c823951291ed5ac5a558bb509d85721d5" Sep 30 14:02:32 crc kubenswrapper[4763]: I0930 14:02:32.425012 4763 scope.go:117] "RemoveContainer" 
containerID="686213d4b4263421d8cbb91e30ca2b44816308bdf4ea83099b02ab13fbaf1258" Sep 30 14:02:32 crc kubenswrapper[4763]: I0930 14:02:32.491975 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:02:32 crc kubenswrapper[4763]: E0930 14:02:32.492299 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:02:43 crc kubenswrapper[4763]: I0930 14:02:43.489553 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:02:43 crc kubenswrapper[4763]: E0930 14:02:43.490425 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:02:58 crc kubenswrapper[4763]: I0930 14:02:58.495087 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:02:58 crc kubenswrapper[4763]: E0930 14:02:58.498931 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:03:13 crc kubenswrapper[4763]: I0930 14:03:13.489817 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:03:13 crc kubenswrapper[4763]: E0930 14:03:13.490860 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:03:24 crc kubenswrapper[4763]: I0930 14:03:24.490050 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:03:24 crc kubenswrapper[4763]: E0930 14:03:24.491213 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:03:32 crc kubenswrapper[4763]: I0930 14:03:32.544644 4763 scope.go:117] "RemoveContainer" 
containerID="4f8c5e6c6bac428024dc97ceaef682e62b42b67d2f61d3a18743765dbbf6718d" Sep 30 14:03:32 crc kubenswrapper[4763]: I0930 14:03:32.567769 4763 scope.go:117] "RemoveContainer" containerID="544efce59b72529479a217ea6778472f68eff77d359bf446afeaeaba8a02a141" Sep 30 14:03:39 crc kubenswrapper[4763]: I0930 14:03:39.489687 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:03:39 crc kubenswrapper[4763]: E0930 14:03:39.490391 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:03:51 crc kubenswrapper[4763]: I0930 14:03:51.489964 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:03:51 crc kubenswrapper[4763]: E0930 14:03:51.491165 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:04:05 crc kubenswrapper[4763]: I0930 14:04:05.490226 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:04:05 crc kubenswrapper[4763]: E0930 14:04:05.491031 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:04:20 crc kubenswrapper[4763]: I0930 14:04:20.489314 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:04:20 crc kubenswrapper[4763]: E0930 14:04:20.489972 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:04:32 crc kubenswrapper[4763]: I0930 14:04:32.635850 4763 scope.go:117] "RemoveContainer" containerID="79023af8bc518b2c3e3c55cdf8786ec0640148a9740e74d8f02ab74827d3004d" Sep 30 14:04:32 crc kubenswrapper[4763]: I0930 14:04:32.661837 4763 scope.go:117] "RemoveContainer" containerID="1b5be17d856f5ffa1ef3f73b9ba58a7f55f8b6924cca7141b43e2383ae2c3208" Sep 30 14:04:33 crc kubenswrapper[4763]: I0930 14:04:33.488897 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:04:33 crc kubenswrapper[4763]: E0930 14:04:33.489106 4763 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:04:45 crc kubenswrapper[4763]: I0930 14:04:45.489172 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:04:45 crc kubenswrapper[4763]: E0930 14:04:45.490208 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:05:00 crc kubenswrapper[4763]: I0930 14:05:00.490017 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:05:00 crc kubenswrapper[4763]: E0930 14:05:00.491105 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:05:11 crc kubenswrapper[4763]: I0930 14:05:11.489753 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:05:11 crc kubenswrapper[4763]: E0930 14:05:11.491476 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:05:22 crc kubenswrapper[4763]: I0930 14:05:22.489934 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:05:22 crc kubenswrapper[4763]: E0930 14:05:22.490509 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:05:35 crc kubenswrapper[4763]: I0930 14:05:35.489283 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:05:35 crc kubenswrapper[4763]: E0930 14:05:35.489987 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:05:48 crc kubenswrapper[4763]: I0930 14:05:48.496927 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:05:50 crc kubenswrapper[4763]: I0930 14:05:50.046336 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"851b4c70bd1a23b8bd979398e3d4bcd4b1ba45ed72f4a89874505436d3a53223"} Sep 30 14:08:06 crc kubenswrapper[4763]: I0930 14:08:06.060334 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:08:06 crc kubenswrapper[4763]: I0930 14:08:06.060981 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:08:36 crc kubenswrapper[4763]: I0930 14:08:36.060048 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:08:36 crc kubenswrapper[4763]: I0930 14:08:36.060644 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:08:49 crc kubenswrapper[4763]: I0930 14:08:49.748230 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lp7kv"] Sep 30 14:08:49 crc kubenswrapper[4763]: E0930 14:08:49.749158 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" containerName="extract-content" Sep 30 14:08:49 crc kubenswrapper[4763]: I0930 14:08:49.749174 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" containerName="extract-content" Sep 30 14:08:49 crc kubenswrapper[4763]: E0930 14:08:49.749202 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" containerName="registry-server" Sep 30 14:08:49 crc kubenswrapper[4763]: I0930 14:08:49.749209 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" containerName="registry-server" Sep 30 14:08:49 crc kubenswrapper[4763]: E0930 14:08:49.749220 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" containerName="extract-utilities" Sep 30 14:08:49 crc kubenswrapper[4763]: I0930 14:08:49.749227 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" containerName="extract-utilities" Sep 30 14:08:49 crc kubenswrapper[4763]: I0930 14:08:49.749430 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a997fcb3-0e0d-4508-8eb3-bb797ee5b8dc" containerName="registry-server" Sep 30 14:08:49 crc kubenswrapper[4763]: I0930 14:08:49.750817 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:08:49 crc kubenswrapper[4763]: I0930 14:08:49.765354 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lp7kv"] Sep 30 14:08:49 crc kubenswrapper[4763]: I0930 14:08:49.946201 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-catalog-content\") pod \"redhat-marketplace-lp7kv\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:08:49 crc kubenswrapper[4763]: I0930 14:08:49.947147 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-utilities\") pod \"redhat-marketplace-lp7kv\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:08:49 crc kubenswrapper[4763]: I0930 14:08:49.947223 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpnwz\" (UniqueName: \"kubernetes.io/projected/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-kube-api-access-mpnwz\") pod \"redhat-marketplace-lp7kv\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:08:50 crc kubenswrapper[4763]: I0930 14:08:50.048506 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-catalog-content\") pod \"redhat-marketplace-lp7kv\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:08:50 crc kubenswrapper[4763]: I0930 14:08:50.048585 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-utilities\") pod \"redhat-marketplace-lp7kv\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:08:50 crc kubenswrapper[4763]: I0930 14:08:50.048635 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpnwz\" (UniqueName: \"kubernetes.io/projected/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-kube-api-access-mpnwz\") pod \"redhat-marketplace-lp7kv\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:08:50 crc kubenswrapper[4763]: I0930 14:08:50.049150 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-catalog-content\") pod \"redhat-marketplace-lp7kv\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:08:50 crc kubenswrapper[4763]: I0930 14:08:50.049227 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-utilities\") pod \"redhat-marketplace-lp7kv\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:08:50 crc kubenswrapper[4763]: I0930 14:08:50.068131 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpnwz\" (UniqueName: \"kubernetes.io/projected/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-kube-api-access-mpnwz\") pod \"redhat-marketplace-lp7kv\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:08:50 crc kubenswrapper[4763]: I0930 14:08:50.074539 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:08:50 crc kubenswrapper[4763]: I0930 14:08:50.511092 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lp7kv"] Sep 30 14:08:50 crc kubenswrapper[4763]: W0930 14:08:50.514180 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dfcef37_fc28_49af_9f6e_2c05ae63df4d.slice/crio-8b8b44240ab4de512437d194ff9d55d6608fe4ebe8cc002d54b18c10b33f5899 WatchSource:0}: Error finding container 8b8b44240ab4de512437d194ff9d55d6608fe4ebe8cc002d54b18c10b33f5899: Status 404 returned error can't find the container with id 8b8b44240ab4de512437d194ff9d55d6608fe4ebe8cc002d54b18c10b33f5899 Sep 30 14:08:51 crc kubenswrapper[4763]: I0930 14:08:51.337714 4763 generic.go:334] "Generic (PLEG): container finished" podID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" containerID="c3f585a4922516c406d5adc764d86699cb7f658085ded2eee46297052cca4d4a" exitCode=0 Sep 30 14:08:51 crc kubenswrapper[4763]: I0930 14:08:51.337789 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp7kv" event={"ID":"8dfcef37-fc28-49af-9f6e-2c05ae63df4d","Type":"ContainerDied","Data":"c3f585a4922516c406d5adc764d86699cb7f658085ded2eee46297052cca4d4a"} Sep 30 14:08:51 crc kubenswrapper[4763]: I0930 14:08:51.338061 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp7kv" event={"ID":"8dfcef37-fc28-49af-9f6e-2c05ae63df4d","Type":"ContainerStarted","Data":"8b8b44240ab4de512437d194ff9d55d6608fe4ebe8cc002d54b18c10b33f5899"} Sep 30 14:08:51 crc kubenswrapper[4763]: I0930 14:08:51.342061 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:08:53 crc kubenswrapper[4763]: I0930 14:08:53.352670 4763 generic.go:334] "Generic (PLEG): container finished" podID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" containerID="5c4ebe17e6f858bd4b9eb79f07a857106fd290f51f81ceafad7b3705ad431a58" exitCode=0 Sep 30 14:08:53 crc kubenswrapper[4763]: I0930 14:08:53.352879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp7kv" event={"ID":"8dfcef37-fc28-49af-9f6e-2c05ae63df4d","Type":"ContainerDied","Data":"5c4ebe17e6f858bd4b9eb79f07a857106fd290f51f81ceafad7b3705ad431a58"} Sep 30 14:08:54 crc kubenswrapper[4763]: I0930 14:08:54.361831 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp7kv" 
event={"ID":"8dfcef37-fc28-49af-9f6e-2c05ae63df4d","Type":"ContainerStarted","Data":"5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8"} Sep 30 14:08:54 crc kubenswrapper[4763]: I0930 14:08:54.383252 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lp7kv" podStartSLOduration=2.9705117640000003 podStartE2EDuration="5.383233545s" podCreationTimestamp="2025-09-30 14:08:49 +0000 UTC" firstStartedPulling="2025-09-30 14:08:51.341691407 +0000 UTC m=+2003.480251732" lastFinishedPulling="2025-09-30 14:08:53.754413188 +0000 UTC m=+2005.892973513" observedRunningTime="2025-09-30 14:08:54.377139312 +0000 UTC m=+2006.515699597" watchObservedRunningTime="2025-09-30 14:08:54.383233545 +0000 UTC m=+2006.521793840" Sep 30 14:09:00 crc kubenswrapper[4763]: I0930 14:09:00.075680 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:09:00 crc kubenswrapper[4763]: I0930 14:09:00.076351 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:09:00 crc kubenswrapper[4763]: I0930 14:09:00.123355 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:09:00 crc kubenswrapper[4763]: I0930 14:09:00.482007 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:09:00 crc kubenswrapper[4763]: I0930 14:09:00.526384 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lp7kv"] Sep 30 14:09:02 crc kubenswrapper[4763]: I0930 14:09:02.422459 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lp7kv" podUID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" containerName="registry-server" containerID="cri-o://5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8" gracePeriod=2 Sep 30 14:09:02 crc kubenswrapper[4763]: I0930 14:09:02.798938 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:09:02 crc kubenswrapper[4763]: I0930 14:09:02.931128 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpnwz\" (UniqueName: \"kubernetes.io/projected/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-kube-api-access-mpnwz\") pod \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " Sep 30 14:09:02 crc kubenswrapper[4763]: I0930 14:09:02.931225 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-catalog-content\") pod \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " Sep 30 14:09:02 crc kubenswrapper[4763]: I0930 14:09:02.931355 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-utilities\") pod \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\" (UID: \"8dfcef37-fc28-49af-9f6e-2c05ae63df4d\") " Sep 30 14:09:02 crc kubenswrapper[4763]: I0930 14:09:02.932459 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-utilities" (OuterVolumeSpecName: "utilities") pod "8dfcef37-fc28-49af-9f6e-2c05ae63df4d" (UID: "8dfcef37-fc28-49af-9f6e-2c05ae63df4d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:09:02 crc kubenswrapper[4763]: I0930 14:09:02.938551 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-kube-api-access-mpnwz" (OuterVolumeSpecName: "kube-api-access-mpnwz") pod "8dfcef37-fc28-49af-9f6e-2c05ae63df4d" (UID: "8dfcef37-fc28-49af-9f6e-2c05ae63df4d"). InnerVolumeSpecName "kube-api-access-mpnwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:09:02 crc kubenswrapper[4763]: I0930 14:09:02.944523 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8dfcef37-fc28-49af-9f6e-2c05ae63df4d" (UID: "8dfcef37-fc28-49af-9f6e-2c05ae63df4d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.033477 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpnwz\" (UniqueName: \"kubernetes.io/projected/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-kube-api-access-mpnwz\") on node \"crc\" DevicePath \"\"" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.033525 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.033538 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfcef37-fc28-49af-9f6e-2c05ae63df4d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.431560 4763 generic.go:334] "Generic (PLEG): container finished" podID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" containerID="5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8" exitCode=0 Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.431648 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lp7kv" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.431670 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp7kv" event={"ID":"8dfcef37-fc28-49af-9f6e-2c05ae63df4d","Type":"ContainerDied","Data":"5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8"} Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.432572 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp7kv" event={"ID":"8dfcef37-fc28-49af-9f6e-2c05ae63df4d","Type":"ContainerDied","Data":"8b8b44240ab4de512437d194ff9d55d6608fe4ebe8cc002d54b18c10b33f5899"} Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.432616 4763 scope.go:117] "RemoveContainer" containerID="5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.465825 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lp7kv"] Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.471737 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lp7kv"] Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.472424 4763 scope.go:117] "RemoveContainer" containerID="5c4ebe17e6f858bd4b9eb79f07a857106fd290f51f81ceafad7b3705ad431a58" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.491952 4763 scope.go:117] "RemoveContainer" containerID="c3f585a4922516c406d5adc764d86699cb7f658085ded2eee46297052cca4d4a" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.517132 4763 scope.go:117] "RemoveContainer" containerID="5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8" Sep 30 14:09:03 crc kubenswrapper[4763]: E0930 14:09:03.517480 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8\": container with ID starting with 5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8 not found: ID does not exist" containerID="5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.517515 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8"} err="failed to get container status \"5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8\": rpc error: code = NotFound desc = could not find container \"5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8\": container with ID starting with 5ba9ff4b4354a30cb6972338c18b8ad0f5afa5916892f92b5b240a91797a73d8 not found: ID does not exist" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.517535 4763 scope.go:117] "RemoveContainer" containerID="5c4ebe17e6f858bd4b9eb79f07a857106fd290f51f81ceafad7b3705ad431a58" Sep 30 14:09:03 crc kubenswrapper[4763]: E0930 14:09:03.517761 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c4ebe17e6f858bd4b9eb79f07a857106fd290f51f81ceafad7b3705ad431a58\": container with ID starting with 5c4ebe17e6f858bd4b9eb79f07a857106fd290f51f81ceafad7b3705ad431a58 not found: ID does not exist" containerID="5c4ebe17e6f858bd4b9eb79f07a857106fd290f51f81ceafad7b3705ad431a58" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.517785 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c4ebe17e6f858bd4b9eb79f07a857106fd290f51f81ceafad7b3705ad431a58"} err="failed to get container status \"5c4ebe17e6f858bd4b9eb79f07a857106fd290f51f81ceafad7b3705ad431a58\": rpc error: code = NotFound desc = could not find container \"5c4ebe17e6f858bd4b9eb79f07a857106fd290f51f81ceafad7b3705ad431a58\": container with ID starting with 5c4ebe17e6f858bd4b9eb79f07a857106fd290f51f81ceafad7b3705ad431a58 not found: ID does not exist" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.517799 4763 scope.go:117] "RemoveContainer" containerID="c3f585a4922516c406d5adc764d86699cb7f658085ded2eee46297052cca4d4a" Sep 30 14:09:03 crc kubenswrapper[4763]: E0930 14:09:03.518420 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f585a4922516c406d5adc764d86699cb7f658085ded2eee46297052cca4d4a\": container with ID starting with c3f585a4922516c406d5adc764d86699cb7f658085ded2eee46297052cca4d4a not found: ID does not exist" containerID="c3f585a4922516c406d5adc764d86699cb7f658085ded2eee46297052cca4d4a" Sep 30 14:09:03 crc kubenswrapper[4763]: I0930 14:09:03.518444 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f585a4922516c406d5adc764d86699cb7f658085ded2eee46297052cca4d4a"} err="failed to get container status \"c3f585a4922516c406d5adc764d86699cb7f658085ded2eee46297052cca4d4a\": rpc error: code = NotFound desc = could not find container \"c3f585a4922516c406d5adc764d86699cb7f658085ded2eee46297052cca4d4a\": container with ID starting with c3f585a4922516c406d5adc764d86699cb7f658085ded2eee46297052cca4d4a not found: ID does not exist" Sep 30 14:09:04 crc kubenswrapper[4763]: I0930 14:09:04.501282 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" path="/var/lib/kubelet/pods/8dfcef37-fc28-49af-9f6e-2c05ae63df4d/volumes" Sep 30 14:09:06 crc kubenswrapper[4763]: I0930 14:09:06.059346 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:09:06 crc kubenswrapper[4763]: I0930 14:09:06.059641 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:09:06 crc kubenswrapper[4763]: I0930 14:09:06.059684 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 14:09:06 crc kubenswrapper[4763]: I0930 14:09:06.060174 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"851b4c70bd1a23b8bd979398e3d4bcd4b1ba45ed72f4a89874505436d3a53223"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:09:06 crc kubenswrapper[4763]: I0930 14:09:06.060215 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://851b4c70bd1a23b8bd979398e3d4bcd4b1ba45ed72f4a89874505436d3a53223" gracePeriod=600 Sep 30 14:09:06 crc kubenswrapper[4763]: I0930 14:09:06.463511 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="851b4c70bd1a23b8bd979398e3d4bcd4b1ba45ed72f4a89874505436d3a53223" exitCode=0 Sep 30 14:09:06 crc kubenswrapper[4763]: I0930 14:09:06.463618 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"851b4c70bd1a23b8bd979398e3d4bcd4b1ba45ed72f4a89874505436d3a53223"} Sep 30 14:09:06 crc kubenswrapper[4763]: I0930 14:09:06.463964 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e"} Sep 30 14:09:06 crc kubenswrapper[4763]: I0930 14:09:06.463993 4763 scope.go:117] "RemoveContainer" containerID="557200a56b20b88c1d05055942e670f2ed834e8f99d12c689800d7472e0295d3" Sep 30 14:09:47 crc kubenswrapper[4763]: I0930 14:09:47.915318 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gqd6r"] Sep 30 14:09:47 crc kubenswrapper[4763]: E0930 14:09:47.916318 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" containerName="extract-content" Sep 30 14:09:47 crc kubenswrapper[4763]: I0930 14:09:47.916341 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" containerName="extract-content" Sep 30 14:09:47 crc kubenswrapper[4763]: E0930 14:09:47.916369 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" containerName="extract-utilities" Sep 30 14:09:47 crc kubenswrapper[4763]: I0930 14:09:47.916380 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" 
containerName="extract-utilities" Sep 30 14:09:47 crc kubenswrapper[4763]: E0930 14:09:47.916403 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" containerName="registry-server" Sep 30 14:09:47 crc kubenswrapper[4763]: I0930 14:09:47.916415 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" containerName="registry-server" Sep 30 14:09:47 crc kubenswrapper[4763]: I0930 14:09:47.916675 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfcef37-fc28-49af-9f6e-2c05ae63df4d" containerName="registry-server" Sep 30 14:09:47 crc kubenswrapper[4763]: I0930 14:09:47.918240 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:47 crc kubenswrapper[4763]: I0930 14:09:47.933786 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqd6r"] Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.020952 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-catalog-content\") pod \"community-operators-gqd6r\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.021121 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqtxn\" (UniqueName: \"kubernetes.io/projected/43bcdc16-5240-41df-8aad-cd7be04fcead-kube-api-access-qqtxn\") pod \"community-operators-gqd6r\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.021219 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-utilities\") pod \"community-operators-gqd6r\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.122697 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-utilities\") pod \"community-operators-gqd6r\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.122820 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-catalog-content\") pod \"community-operators-gqd6r\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.122883 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqtxn\" (UniqueName: \"kubernetes.io/projected/43bcdc16-5240-41df-8aad-cd7be04fcead-kube-api-access-qqtxn\") pod \"community-operators-gqd6r\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.123304 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-utilities\") pod \"community-operators-gqd6r\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.123321 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-catalog-content\") pod \"community-operators-gqd6r\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.155784 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqtxn\" (UniqueName: \"kubernetes.io/projected/43bcdc16-5240-41df-8aad-cd7be04fcead-kube-api-access-qqtxn\") pod \"community-operators-gqd6r\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.240655 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.736253 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqd6r"] Sep 30 14:09:48 crc kubenswrapper[4763]: I0930 14:09:48.789565 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqd6r" event={"ID":"43bcdc16-5240-41df-8aad-cd7be04fcead","Type":"ContainerStarted","Data":"c364a1a10a736cacfe625fa6729c2c0ef50c3c3530ef8ddfd5be880daef6c673"} Sep 30 14:09:49 crc kubenswrapper[4763]: I0930 14:09:49.798328 4763 generic.go:334] "Generic (PLEG): container finished" podID="43bcdc16-5240-41df-8aad-cd7be04fcead" containerID="88fee8740fdebeb51006627f5ab1ddf416412e6d57464dab2f36771617948733" exitCode=0 Sep 30 14:09:49 crc kubenswrapper[4763]: I0930 14:09:49.798419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqd6r" event={"ID":"43bcdc16-5240-41df-8aad-cd7be04fcead","Type":"ContainerDied","Data":"88fee8740fdebeb51006627f5ab1ddf416412e6d57464dab2f36771617948733"} Sep 30 14:09:51 crc kubenswrapper[4763]: I0930 14:09:51.812563 4763 generic.go:334] "Generic (PLEG): container finished" podID="43bcdc16-5240-41df-8aad-cd7be04fcead" containerID="56ed887ac2ef9fa7b24449226c5ce6726e116bd636f129a8f243da62c3cb09de" exitCode=0 Sep 30 14:09:51 crc kubenswrapper[4763]: I0930 14:09:51.813030 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqd6r" event={"ID":"43bcdc16-5240-41df-8aad-cd7be04fcead","Type":"ContainerDied","Data":"56ed887ac2ef9fa7b24449226c5ce6726e116bd636f129a8f243da62c3cb09de"} Sep 30 14:09:52 crc kubenswrapper[4763]: I0930 14:09:52.821846 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqd6r" event={"ID":"43bcdc16-5240-41df-8aad-cd7be04fcead","Type":"ContainerStarted","Data":"a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349"} Sep 30 14:09:52 crc kubenswrapper[4763]: I0930 14:09:52.842667 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gqd6r" podStartSLOduration=3.288960749 podStartE2EDuration="5.842650223s" podCreationTimestamp="2025-09-30 14:09:47 +0000 UTC" 
firstStartedPulling="2025-09-30 14:09:49.800401571 +0000 UTC m=+2061.938961856" lastFinishedPulling="2025-09-30 14:09:52.354091045 +0000 UTC m=+2064.492651330" observedRunningTime="2025-09-30 14:09:52.84050068 +0000 UTC m=+2064.979060965" watchObservedRunningTime="2025-09-30 14:09:52.842650223 +0000 UTC m=+2064.981210528" Sep 30 14:09:58 crc kubenswrapper[4763]: I0930 14:09:58.241587 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:58 crc kubenswrapper[4763]: I0930 14:09:58.242037 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:58 crc kubenswrapper[4763]: I0930 14:09:58.287729 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:58 crc kubenswrapper[4763]: I0930 14:09:58.900584 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:09:58 crc kubenswrapper[4763]: I0930 14:09:58.947466 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gqd6r"] Sep 30 14:10:00 crc kubenswrapper[4763]: I0930 14:10:00.873629 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gqd6r" podUID="43bcdc16-5240-41df-8aad-cd7be04fcead" containerName="registry-server" containerID="cri-o://a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349" gracePeriod=2 Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.270410 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.405566 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-utilities\") pod \"43bcdc16-5240-41df-8aad-cd7be04fcead\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.405689 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqtxn\" (UniqueName: \"kubernetes.io/projected/43bcdc16-5240-41df-8aad-cd7be04fcead-kube-api-access-qqtxn\") pod \"43bcdc16-5240-41df-8aad-cd7be04fcead\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.405728 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-catalog-content\") pod \"43bcdc16-5240-41df-8aad-cd7be04fcead\" (UID: \"43bcdc16-5240-41df-8aad-cd7be04fcead\") " Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.406326 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-utilities" (OuterVolumeSpecName: "utilities") pod "43bcdc16-5240-41df-8aad-cd7be04fcead" (UID: "43bcdc16-5240-41df-8aad-cd7be04fcead"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.418847 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43bcdc16-5240-41df-8aad-cd7be04fcead-kube-api-access-qqtxn" (OuterVolumeSpecName: "kube-api-access-qqtxn") pod "43bcdc16-5240-41df-8aad-cd7be04fcead" (UID: "43bcdc16-5240-41df-8aad-cd7be04fcead"). InnerVolumeSpecName "kube-api-access-qqtxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.507487 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.507537 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqtxn\" (UniqueName: \"kubernetes.io/projected/43bcdc16-5240-41df-8aad-cd7be04fcead-kube-api-access-qqtxn\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.885058 4763 generic.go:334] "Generic (PLEG): container finished" podID="43bcdc16-5240-41df-8aad-cd7be04fcead" containerID="a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349" exitCode=0 Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.885126 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqd6r" event={"ID":"43bcdc16-5240-41df-8aad-cd7be04fcead","Type":"ContainerDied","Data":"a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349"} Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.885167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqd6r" event={"ID":"43bcdc16-5240-41df-8aad-cd7be04fcead","Type":"ContainerDied","Data":"c364a1a10a736cacfe625fa6729c2c0ef50c3c3530ef8ddfd5be880daef6c673"} Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.885196 4763 scope.go:117] "RemoveContainer" containerID="a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.885384 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqd6r" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.910004 4763 scope.go:117] "RemoveContainer" containerID="56ed887ac2ef9fa7b24449226c5ce6726e116bd636f129a8f243da62c3cb09de" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.929915 4763 scope.go:117] "RemoveContainer" containerID="88fee8740fdebeb51006627f5ab1ddf416412e6d57464dab2f36771617948733" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.969384 4763 scope.go:117] "RemoveContainer" containerID="a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349" Sep 30 14:10:01 crc kubenswrapper[4763]: E0930 14:10:01.969989 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349\": container with ID starting with a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349 not found: ID does not exist" containerID="a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.970030 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349"} err="failed to get container status \"a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349\": rpc error: code = NotFound desc = could not find container \"a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349\": container with ID starting with a0e0ffbe27a212d59b2d1bb9f82c8d3e87984d7dd0664efc8dc4003715f83349 not found: ID does not exist" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.970056 4763 scope.go:117] "RemoveContainer" containerID="56ed887ac2ef9fa7b24449226c5ce6726e116bd636f129a8f243da62c3cb09de" Sep 30 14:10:01 crc kubenswrapper[4763]: E0930 14:10:01.970893 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56ed887ac2ef9fa7b24449226c5ce6726e116bd636f129a8f243da62c3cb09de\": container with ID starting with 56ed887ac2ef9fa7b24449226c5ce6726e116bd636f129a8f243da62c3cb09de not found: ID does not exist" containerID="56ed887ac2ef9fa7b24449226c5ce6726e116bd636f129a8f243da62c3cb09de" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.970925 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ed887ac2ef9fa7b24449226c5ce6726e116bd636f129a8f243da62c3cb09de"} err="failed to get container status \"56ed887ac2ef9fa7b24449226c5ce6726e116bd636f129a8f243da62c3cb09de\": rpc error: code = NotFound desc = could not find container \"56ed887ac2ef9fa7b24449226c5ce6726e116bd636f129a8f243da62c3cb09de\": container with ID starting with 56ed887ac2ef9fa7b24449226c5ce6726e116bd636f129a8f243da62c3cb09de not found: ID does not exist" Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.970941 4763 scope.go:117] "RemoveContainer" containerID="88fee8740fdebeb51006627f5ab1ddf416412e6d57464dab2f36771617948733" Sep 30 14:10:01 crc kubenswrapper[4763]: E0930 14:10:01.971390 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fee8740fdebeb51006627f5ab1ddf416412e6d57464dab2f36771617948733\": container with ID starting with 88fee8740fdebeb51006627f5ab1ddf416412e6d57464dab2f36771617948733 not found: ID does not exist" containerID="88fee8740fdebeb51006627f5ab1ddf416412e6d57464dab2f36771617948733" 
Sep 30 14:10:01 crc kubenswrapper[4763]: I0930 14:10:01.971416 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fee8740fdebeb51006627f5ab1ddf416412e6d57464dab2f36771617948733"} err="failed to get container status \"88fee8740fdebeb51006627f5ab1ddf416412e6d57464dab2f36771617948733\": rpc error: code = NotFound desc = could not find container \"88fee8740fdebeb51006627f5ab1ddf416412e6d57464dab2f36771617948733\": container with ID starting with 88fee8740fdebeb51006627f5ab1ddf416412e6d57464dab2f36771617948733 not found: ID does not exist" Sep 30 14:10:02 crc kubenswrapper[4763]: I0930 14:10:02.005803 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43bcdc16-5240-41df-8aad-cd7be04fcead" (UID: "43bcdc16-5240-41df-8aad-cd7be04fcead"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:10:02 crc kubenswrapper[4763]: I0930 14:10:02.015194 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43bcdc16-5240-41df-8aad-cd7be04fcead-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:02 crc kubenswrapper[4763]: I0930 14:10:02.214944 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gqd6r"] Sep 30 14:10:02 crc kubenswrapper[4763]: I0930 14:10:02.221269 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gqd6r"] Sep 30 14:10:02 crc kubenswrapper[4763]: I0930 14:10:02.498714 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43bcdc16-5240-41df-8aad-cd7be04fcead" path="/var/lib/kubelet/pods/43bcdc16-5240-41df-8aad-cd7be04fcead/volumes" Sep 30 14:11:06 crc kubenswrapper[4763]: I0930 14:11:06.059301 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:11:06 crc kubenswrapper[4763]: I0930 14:11:06.059924 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:11:36 crc kubenswrapper[4763]: I0930 14:11:36.059681 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:11:36 crc kubenswrapper[4763]: I0930 14:11:36.060369 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:12:06 crc kubenswrapper[4763]: I0930 14:12:06.060480 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:12:06 crc kubenswrapper[4763]: I0930 14:12:06.061157 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:12:06 crc kubenswrapper[4763]: I0930 14:12:06.061216 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 14:12:06 crc kubenswrapper[4763]: I0930 14:12:06.061838 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:12:06 crc kubenswrapper[4763]: I0930 14:12:06.061907 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" gracePeriod=600 Sep 30 14:12:06 crc kubenswrapper[4763]: E0930 14:12:06.187508 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:12:06 crc kubenswrapper[4763]: I0930 14:12:06.773728 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" exitCode=0 Sep 30 14:12:06 crc kubenswrapper[4763]: I0930 14:12:06.773778 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e"} Sep 30 14:12:06 crc kubenswrapper[4763]: I0930 14:12:06.773817 4763 scope.go:117] "RemoveContainer" containerID="851b4c70bd1a23b8bd979398e3d4bcd4b1ba45ed72f4a89874505436d3a53223" Sep 30 14:12:06 crc kubenswrapper[4763]: I0930 14:12:06.774423 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:12:06 crc kubenswrapper[4763]: E0930 14:12:06.774686 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" 
podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:12:19 crc kubenswrapper[4763]: I0930 14:12:19.489593 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:12:19 crc kubenswrapper[4763]: E0930 14:12:19.490490 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:12:31 crc kubenswrapper[4763]: I0930 14:12:31.489916 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:12:31 crc kubenswrapper[4763]: E0930 14:12:31.490956 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:12:45 crc kubenswrapper[4763]: I0930 14:12:45.489084 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:12:45 crc kubenswrapper[4763]: E0930 14:12:45.490002 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:13:00 crc kubenswrapper[4763]: I0930 14:13:00.490166 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:13:00 crc kubenswrapper[4763]: E0930 14:13:00.490986 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.123852 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mp4wr"] Sep 30 14:13:09 crc kubenswrapper[4763]: E0930 14:13:09.124532 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bcdc16-5240-41df-8aad-cd7be04fcead" containerName="extract-utilities" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.124548 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bcdc16-5240-41df-8aad-cd7be04fcead" containerName="extract-utilities" Sep 30 14:13:09 crc kubenswrapper[4763]: E0930 14:13:09.124561 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bcdc16-5240-41df-8aad-cd7be04fcead" containerName="extract-content" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 
14:13:09.124566 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bcdc16-5240-41df-8aad-cd7be04fcead" containerName="extract-content" Sep 30 14:13:09 crc kubenswrapper[4763]: E0930 14:13:09.124623 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bcdc16-5240-41df-8aad-cd7be04fcead" containerName="registry-server" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.124630 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bcdc16-5240-41df-8aad-cd7be04fcead" containerName="registry-server" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.124786 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="43bcdc16-5240-41df-8aad-cd7be04fcead" containerName="registry-server" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.125863 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.136981 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mp4wr"] Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.262627 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620c7c8-bb53-44f3-8723-828bf69bb55e-catalog-content\") pod \"redhat-operators-mp4wr\" (UID: \"b620c7c8-bb53-44f3-8723-828bf69bb55e\") " pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.263063 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620c7c8-bb53-44f3-8723-828bf69bb55e-utilities\") pod \"redhat-operators-mp4wr\" (UID: \"b620c7c8-bb53-44f3-8723-828bf69bb55e\") " pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.263117 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpgzn\" (UniqueName: \"kubernetes.io/projected/b620c7c8-bb53-44f3-8723-828bf69bb55e-kube-api-access-rpgzn\") pod \"redhat-operators-mp4wr\" (UID: \"b620c7c8-bb53-44f3-8723-828bf69bb55e\") " pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.364951 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620c7c8-bb53-44f3-8723-828bf69bb55e-utilities\") pod \"redhat-operators-mp4wr\" (UID: \"b620c7c8-bb53-44f3-8723-828bf69bb55e\") " pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.365009 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpgzn\" (UniqueName: \"kubernetes.io/projected/b620c7c8-bb53-44f3-8723-828bf69bb55e-kube-api-access-rpgzn\") pod \"redhat-operators-mp4wr\" (UID: \"b620c7c8-bb53-44f3-8723-828bf69bb55e\") " pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.365050 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620c7c8-bb53-44f3-8723-828bf69bb55e-catalog-content\") pod \"redhat-operators-mp4wr\" (UID: \"b620c7c8-bb53-44f3-8723-828bf69bb55e\") " pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 
14:13:09.365494 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620c7c8-bb53-44f3-8723-828bf69bb55e-utilities\") pod \"redhat-operators-mp4wr\" (UID: \"b620c7c8-bb53-44f3-8723-828bf69bb55e\") " pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.365545 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620c7c8-bb53-44f3-8723-828bf69bb55e-catalog-content\") pod \"redhat-operators-mp4wr\" (UID: \"b620c7c8-bb53-44f3-8723-828bf69bb55e\") " pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.384059 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpgzn\" (UniqueName: \"kubernetes.io/projected/b620c7c8-bb53-44f3-8723-828bf69bb55e-kube-api-access-rpgzn\") pod \"redhat-operators-mp4wr\" (UID: \"b620c7c8-bb53-44f3-8723-828bf69bb55e\") " pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.455180 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:09 crc kubenswrapper[4763]: I0930 14:13:09.892737 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mp4wr"] Sep 30 14:13:10 crc kubenswrapper[4763]: I0930 14:13:10.236897 4763 generic.go:334] "Generic (PLEG): container finished" podID="b620c7c8-bb53-44f3-8723-828bf69bb55e" containerID="9f00bfc6d1ce936e64296d8152cd55d480f9399526406c2dcdd879dc5533e4da" exitCode=0 Sep 30 14:13:10 crc kubenswrapper[4763]: I0930 14:13:10.236939 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mp4wr" event={"ID":"b620c7c8-bb53-44f3-8723-828bf69bb55e","Type":"ContainerDied","Data":"9f00bfc6d1ce936e64296d8152cd55d480f9399526406c2dcdd879dc5533e4da"} Sep 30 14:13:10 crc kubenswrapper[4763]: I0930 14:13:10.236965 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mp4wr" event={"ID":"b620c7c8-bb53-44f3-8723-828bf69bb55e","Type":"ContainerStarted","Data":"bcfb7bd7f26075910671aabdd52a24dc9b45c3d39fb50834588f367b8d7995d0"} Sep 30 14:13:12 crc kubenswrapper[4763]: I0930 14:13:12.489128 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:13:12 crc kubenswrapper[4763]: E0930 14:13:12.489615 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:13:17 crc kubenswrapper[4763]: I0930 14:13:17.287639 4763 generic.go:334] "Generic (PLEG): container finished" podID="b620c7c8-bb53-44f3-8723-828bf69bb55e" containerID="79f11da6ab602d24f581ea6712d89e875841c719e6bc031ecb3c8afe28fe8854" exitCode=0 Sep 30 14:13:17 crc kubenswrapper[4763]: I0930 14:13:17.287725 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mp4wr" 
event={"ID":"b620c7c8-bb53-44f3-8723-828bf69bb55e","Type":"ContainerDied","Data":"79f11da6ab602d24f581ea6712d89e875841c719e6bc031ecb3c8afe28fe8854"} Sep 30 14:13:18 crc kubenswrapper[4763]: I0930 14:13:18.296750 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mp4wr" event={"ID":"b620c7c8-bb53-44f3-8723-828bf69bb55e","Type":"ContainerStarted","Data":"176f91b08e5ab7a8b99bfdb50bfb10016495c96e450662384ca0762f8d8309b6"} Sep 30 14:13:18 crc kubenswrapper[4763]: I0930 14:13:18.316459 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mp4wr" podStartSLOduration=1.8162525870000001 podStartE2EDuration="9.316441496s" podCreationTimestamp="2025-09-30 14:13:09 +0000 UTC" firstStartedPulling="2025-09-30 14:13:10.238194584 +0000 UTC m=+2262.376754869" lastFinishedPulling="2025-09-30 14:13:17.738383473 +0000 UTC m=+2269.876943778" observedRunningTime="2025-09-30 14:13:18.315818171 +0000 UTC m=+2270.454378456" watchObservedRunningTime="2025-09-30 14:13:18.316441496 +0000 UTC m=+2270.455001781" Sep 30 14:13:19 crc kubenswrapper[4763]: I0930 14:13:19.455473 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:19 crc kubenswrapper[4763]: I0930 14:13:19.455899 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:20 crc kubenswrapper[4763]: I0930 14:13:20.495275 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mp4wr" podUID="b620c7c8-bb53-44f3-8723-828bf69bb55e" containerName="registry-server" probeResult="failure" output=< Sep 30 14:13:20 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Sep 30 14:13:20 crc kubenswrapper[4763]: > Sep 30 14:13:25 crc kubenswrapper[4763]: I0930 14:13:25.489389 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:13:25 crc kubenswrapper[4763]: E0930 14:13:25.490231 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:13:29 crc kubenswrapper[4763]: I0930 14:13:29.505767 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:29 crc kubenswrapper[4763]: I0930 14:13:29.551194 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mp4wr" Sep 30 14:13:29 crc kubenswrapper[4763]: I0930 14:13:29.625100 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mp4wr"] Sep 30 14:13:29 crc kubenswrapper[4763]: I0930 14:13:29.740447 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4fzf"] Sep 30 14:13:29 crc kubenswrapper[4763]: I0930 14:13:29.741236 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h4fzf" podUID="c69337fe-42df-4d48-8254-9408d35e644c" containerName="registry-server" 
containerID="cri-o://fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409" gracePeriod=2 Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.162989 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.361592 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxbpj\" (UniqueName: \"kubernetes.io/projected/c69337fe-42df-4d48-8254-9408d35e644c-kube-api-access-kxbpj\") pod \"c69337fe-42df-4d48-8254-9408d35e644c\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.361719 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-catalog-content\") pod \"c69337fe-42df-4d48-8254-9408d35e644c\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.361799 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-utilities\") pod \"c69337fe-42df-4d48-8254-9408d35e644c\" (UID: \"c69337fe-42df-4d48-8254-9408d35e644c\") " Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.362403 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-utilities" (OuterVolumeSpecName: "utilities") pod "c69337fe-42df-4d48-8254-9408d35e644c" (UID: "c69337fe-42df-4d48-8254-9408d35e644c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.367200 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c69337fe-42df-4d48-8254-9408d35e644c-kube-api-access-kxbpj" (OuterVolumeSpecName: "kube-api-access-kxbpj") pod "c69337fe-42df-4d48-8254-9408d35e644c" (UID: "c69337fe-42df-4d48-8254-9408d35e644c"). InnerVolumeSpecName "kube-api-access-kxbpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.411131 4763 generic.go:334] "Generic (PLEG): container finished" podID="c69337fe-42df-4d48-8254-9408d35e644c" containerID="fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409" exitCode=0 Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.411205 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4fzf" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.411223 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4fzf" event={"ID":"c69337fe-42df-4d48-8254-9408d35e644c","Type":"ContainerDied","Data":"fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409"} Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.411312 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4fzf" event={"ID":"c69337fe-42df-4d48-8254-9408d35e644c","Type":"ContainerDied","Data":"5a0f38d3558878cafa483b1ba5cb7383025c38312a61ff0134add7c80fdda98f"} Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.411377 4763 scope.go:117] "RemoveContainer" containerID="fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.424272 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c69337fe-42df-4d48-8254-9408d35e644c" (UID: "c69337fe-42df-4d48-8254-9408d35e644c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.430140 4763 scope.go:117] "RemoveContainer" containerID="0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.458423 4763 scope.go:117] "RemoveContainer" containerID="67d34b5d7aea0fe5b17cfb8de200769e0179fe500e6b8927760563ac2282eaee" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.463826 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxbpj\" (UniqueName: \"kubernetes.io/projected/c69337fe-42df-4d48-8254-9408d35e644c-kube-api-access-kxbpj\") on node \"crc\" DevicePath \"\"" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.463861 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.463872 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69337fe-42df-4d48-8254-9408d35e644c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.472089 4763 scope.go:117] "RemoveContainer" containerID="fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409" Sep 30 14:13:30 crc kubenswrapper[4763]: E0930 14:13:30.472519 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409\": container with ID starting with fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409 not found: ID does not exist" containerID="fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.472563 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409"} err="failed to get container status \"fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409\": rpc error: code = NotFound desc = could not find container 
\"fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409\": container with ID starting with fdb61f0f326fadd0ae2f275c359225eb23393e73a685bc96a72c2c0d3b515409 not found: ID does not exist" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.472591 4763 scope.go:117] "RemoveContainer" containerID="0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80" Sep 30 14:13:30 crc kubenswrapper[4763]: E0930 14:13:30.472984 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80\": container with ID starting with 0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80 not found: ID does not exist" containerID="0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.473023 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80"} err="failed to get container status \"0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80\": rpc error: code = NotFound desc = could not find container \"0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80\": container with ID starting with 0e3456fba7226e7cb950894b9a7bbe939816731c14d9ba05ec0b1733df8bff80 not found: ID does not exist" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.473047 4763 scope.go:117] "RemoveContainer" containerID="67d34b5d7aea0fe5b17cfb8de200769e0179fe500e6b8927760563ac2282eaee" Sep 30 14:13:30 crc kubenswrapper[4763]: E0930 14:13:30.473256 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d34b5d7aea0fe5b17cfb8de200769e0179fe500e6b8927760563ac2282eaee\": container with ID starting with 67d34b5d7aea0fe5b17cfb8de200769e0179fe500e6b8927760563ac2282eaee not found: ID does not exist" containerID="67d34b5d7aea0fe5b17cfb8de200769e0179fe500e6b8927760563ac2282eaee" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.473285 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d34b5d7aea0fe5b17cfb8de200769e0179fe500e6b8927760563ac2282eaee"} err="failed to get container status \"67d34b5d7aea0fe5b17cfb8de200769e0179fe500e6b8927760563ac2282eaee\": rpc error: code = NotFound desc = could not find container \"67d34b5d7aea0fe5b17cfb8de200769e0179fe500e6b8927760563ac2282eaee\": container with ID starting with 67d34b5d7aea0fe5b17cfb8de200769e0179fe500e6b8927760563ac2282eaee not found: ID does not exist" Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.729226 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4fzf"] Sep 30 14:13:30 crc kubenswrapper[4763]: I0930 14:13:30.743858 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h4fzf"] Sep 30 14:13:32 crc kubenswrapper[4763]: I0930 14:13:32.497938 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c69337fe-42df-4d48-8254-9408d35e644c" path="/var/lib/kubelet/pods/c69337fe-42df-4d48-8254-9408d35e644c/volumes" Sep 30 14:13:36 crc kubenswrapper[4763]: I0930 14:13:36.489222 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:13:36 crc kubenswrapper[4763]: E0930 14:13:36.490768 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:13:50 crc kubenswrapper[4763]: I0930 14:13:50.489528 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:13:50 crc kubenswrapper[4763]: E0930 14:13:50.491277 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:14:01 crc kubenswrapper[4763]: I0930 14:14:01.488887 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:14:01 crc kubenswrapper[4763]: E0930 14:14:01.489520 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:14:10 crc kubenswrapper[4763]: I0930 14:14:10.749928 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h9rhb"] Sep 30 14:14:10 crc kubenswrapper[4763]: E0930 14:14:10.750976 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69337fe-42df-4d48-8254-9408d35e644c" containerName="extract-content" Sep 30 14:14:10 crc kubenswrapper[4763]: I0930 14:14:10.750993 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69337fe-42df-4d48-8254-9408d35e644c" containerName="extract-content" Sep 30 14:14:10 crc kubenswrapper[4763]: E0930 14:14:10.751048 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69337fe-42df-4d48-8254-9408d35e644c" containerName="registry-server" Sep 30 14:14:10 crc kubenswrapper[4763]: I0930 14:14:10.751057 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69337fe-42df-4d48-8254-9408d35e644c" containerName="registry-server" Sep 30 14:14:10 crc kubenswrapper[4763]: E0930 14:14:10.751077 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69337fe-42df-4d48-8254-9408d35e644c" containerName="extract-utilities" Sep 30 14:14:10 crc kubenswrapper[4763]: I0930 14:14:10.751088 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69337fe-42df-4d48-8254-9408d35e644c" containerName="extract-utilities" Sep 30 14:14:10 crc kubenswrapper[4763]: I0930 14:14:10.751308 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69337fe-42df-4d48-8254-9408d35e644c" containerName="registry-server" Sep 30 14:14:10 crc kubenswrapper[4763]: I0930 14:14:10.755834 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:10 crc kubenswrapper[4763]: I0930 14:14:10.761744 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9rhb"] Sep 30 14:14:10 crc kubenswrapper[4763]: I0930 14:14:10.916852 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lhkl\" (UniqueName: \"kubernetes.io/projected/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-kube-api-access-4lhkl\") pod \"certified-operators-h9rhb\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:10 crc kubenswrapper[4763]: I0930 14:14:10.916903 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-catalog-content\") pod \"certified-operators-h9rhb\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:10 crc kubenswrapper[4763]: I0930 14:14:10.917014 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-utilities\") pod \"certified-operators-h9rhb\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.018570 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-utilities\") pod \"certified-operators-h9rhb\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.018977 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lhkl\" (UniqueName: \"kubernetes.io/projected/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-kube-api-access-4lhkl\") pod \"certified-operators-h9rhb\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.019098 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-catalog-content\") pod \"certified-operators-h9rhb\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.019138 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-utilities\") pod \"certified-operators-h9rhb\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.019384 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-catalog-content\") pod \"certified-operators-h9rhb\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.043142 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4lhkl\" (UniqueName: \"kubernetes.io/projected/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-kube-api-access-4lhkl\") pod \"certified-operators-h9rhb\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.093702 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.385453 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9rhb"] Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.713890 4763 generic.go:334] "Generic (PLEG): container finished" podID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" containerID="881de04bf08f740568dbfe4b10d5e8fe31750f249239f66085c6f820c4404a6a" exitCode=0 Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.713963 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9rhb" event={"ID":"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f","Type":"ContainerDied","Data":"881de04bf08f740568dbfe4b10d5e8fe31750f249239f66085c6f820c4404a6a"} Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.714046 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9rhb" event={"ID":"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f","Type":"ContainerStarted","Data":"fd2153f553644dd621551491fd7391690f80103331ed5629e6d0bdc0635006d8"} Sep 30 14:14:11 crc kubenswrapper[4763]: I0930 14:14:11.727654 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:14:12 crc kubenswrapper[4763]: I0930 14:14:12.726463 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9rhb" event={"ID":"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f","Type":"ContainerStarted","Data":"2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a"} Sep 30 14:14:13 crc kubenswrapper[4763]: I0930 14:14:13.734441 4763 generic.go:334] "Generic (PLEG): container finished" podID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" containerID="2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a" exitCode=0 Sep 30 14:14:13 crc kubenswrapper[4763]: I0930 14:14:13.734482 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9rhb" event={"ID":"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f","Type":"ContainerDied","Data":"2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a"} Sep 30 14:14:14 crc kubenswrapper[4763]: I0930 14:14:14.489981 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:14:14 crc kubenswrapper[4763]: E0930 14:14:14.490504 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:14:14 crc kubenswrapper[4763]: I0930 14:14:14.745694 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9rhb" 
event={"ID":"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f","Type":"ContainerStarted","Data":"526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6"} Sep 30 14:14:14 crc kubenswrapper[4763]: I0930 14:14:14.764426 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h9rhb" podStartSLOduration=2.15663445 podStartE2EDuration="4.764407266s" podCreationTimestamp="2025-09-30 14:14:10 +0000 UTC" firstStartedPulling="2025-09-30 14:14:11.727194082 +0000 UTC m=+2323.865754397" lastFinishedPulling="2025-09-30 14:14:14.334966928 +0000 UTC m=+2326.473527213" observedRunningTime="2025-09-30 14:14:14.762363735 +0000 UTC m=+2326.900924020" watchObservedRunningTime="2025-09-30 14:14:14.764407266 +0000 UTC m=+2326.902967551" Sep 30 14:14:21 crc kubenswrapper[4763]: I0930 14:14:21.094793 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:21 crc kubenswrapper[4763]: I0930 14:14:21.096161 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:21 crc kubenswrapper[4763]: I0930 14:14:21.138323 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:21 crc kubenswrapper[4763]: I0930 14:14:21.846423 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:21 crc kubenswrapper[4763]: I0930 14:14:21.894770 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9rhb"] Sep 30 14:14:23 crc kubenswrapper[4763]: I0930 14:14:23.816574 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h9rhb" podUID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" containerName="registry-server" containerID="cri-o://526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6" gracePeriod=2 Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.179211 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.303470 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-utilities\") pod \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.303537 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-catalog-content\") pod \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.303582 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lhkl\" (UniqueName: \"kubernetes.io/projected/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-kube-api-access-4lhkl\") pod \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\" (UID: \"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f\") " Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.305178 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-utilities" (OuterVolumeSpecName: "utilities") pod "8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" (UID: "8db8b043-51c2-4d09-ae2d-a9ff3e2f328f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.308745 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-kube-api-access-4lhkl" (OuterVolumeSpecName: "kube-api-access-4lhkl") pod "8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" (UID: "8db8b043-51c2-4d09-ae2d-a9ff3e2f328f"). InnerVolumeSpecName "kube-api-access-4lhkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.356097 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" (UID: "8db8b043-51c2-4d09-ae2d-a9ff3e2f328f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.405254 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.405294 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.405312 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lhkl\" (UniqueName: \"kubernetes.io/projected/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f-kube-api-access-4lhkl\") on node \"crc\" DevicePath \"\"" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.824951 4763 generic.go:334] "Generic (PLEG): container finished" podID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" containerID="526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6" exitCode=0 Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.825015 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9rhb" event={"ID":"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f","Type":"ContainerDied","Data":"526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6"} Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.825089 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9rhb" event={"ID":"8db8b043-51c2-4d09-ae2d-a9ff3e2f328f","Type":"ContainerDied","Data":"fd2153f553644dd621551491fd7391690f80103331ed5629e6d0bdc0635006d8"} Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.825130 4763 scope.go:117] "RemoveContainer" containerID="526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.825045 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9rhb" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.845017 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9rhb"] Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.847463 4763 scope.go:117] "RemoveContainer" containerID="2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.849764 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h9rhb"] Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.869084 4763 scope.go:117] "RemoveContainer" containerID="881de04bf08f740568dbfe4b10d5e8fe31750f249239f66085c6f820c4404a6a" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.887736 4763 scope.go:117] "RemoveContainer" containerID="526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6" Sep 30 14:14:24 crc kubenswrapper[4763]: E0930 14:14:24.888196 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6\": container with ID starting with 526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6 not found: ID does not exist" containerID="526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.888239 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6"} err="failed to get container status \"526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6\": rpc error: code = NotFound desc = could not find container \"526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6\": container with ID starting with 526a7296b27ef93dce3ba546562090d244052d312d04a02a170860ecb374e1c6 not found: ID does not exist" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.888269 4763 scope.go:117] "RemoveContainer" containerID="2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a" Sep 30 14:14:24 crc kubenswrapper[4763]: E0930 14:14:24.888660 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a\": container with ID starting with 2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a not found: ID does not exist" containerID="2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.888682 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a"} err="failed to get container status \"2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a\": rpc error: code = NotFound desc = could not find container \"2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a\": container with ID starting with 2508a18833c716e8db1d76eb795ad664723812fcfd97dab931a33509fe16181a not found: ID does not exist" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.888694 4763 scope.go:117] "RemoveContainer" containerID="881de04bf08f740568dbfe4b10d5e8fe31750f249239f66085c6f820c4404a6a" Sep 30 14:14:24 crc kubenswrapper[4763]: E0930 14:14:24.888985 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"881de04bf08f740568dbfe4b10d5e8fe31750f249239f66085c6f820c4404a6a\": container with ID starting with 881de04bf08f740568dbfe4b10d5e8fe31750f249239f66085c6f820c4404a6a not found: ID does not exist" containerID="881de04bf08f740568dbfe4b10d5e8fe31750f249239f66085c6f820c4404a6a" Sep 30 14:14:24 crc kubenswrapper[4763]: I0930 14:14:24.889054 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"881de04bf08f740568dbfe4b10d5e8fe31750f249239f66085c6f820c4404a6a"} err="failed to get container status \"881de04bf08f740568dbfe4b10d5e8fe31750f249239f66085c6f820c4404a6a\": rpc error: code = NotFound desc = could not find container \"881de04bf08f740568dbfe4b10d5e8fe31750f249239f66085c6f820c4404a6a\": container with ID starting with 881de04bf08f740568dbfe4b10d5e8fe31750f249239f66085c6f820c4404a6a not found: ID does not exist" Sep 30 14:14:26 crc kubenswrapper[4763]: I0930 14:14:26.499366 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" path="/var/lib/kubelet/pods/8db8b043-51c2-4d09-ae2d-a9ff3e2f328f/volumes" Sep 30 14:14:28 crc kubenswrapper[4763]: I0930 14:14:28.495280 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:14:28 crc kubenswrapper[4763]: E0930 14:14:28.495636 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:14:43 crc kubenswrapper[4763]: I0930 14:14:43.489123 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:14:43 crc kubenswrapper[4763]: E0930 14:14:43.489756 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:14:56 crc kubenswrapper[4763]: I0930 14:14:56.489675 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:14:56 crc kubenswrapper[4763]: E0930 14:14:56.490398 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.141556 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh"] Sep 30 14:15:00 crc kubenswrapper[4763]: E0930 14:15:00.143360 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" containerName="extract-utilities" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.143497 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" containerName="extract-utilities" Sep 30 14:15:00 crc kubenswrapper[4763]: E0930 14:15:00.143591 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" containerName="registry-server" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.143731 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" containerName="registry-server" Sep 30 14:15:00 crc kubenswrapper[4763]: E0930 14:15:00.143880 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" containerName="extract-content" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.143993 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" containerName="extract-content" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.144411 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db8b043-51c2-4d09-ae2d-a9ff3e2f328f" containerName="registry-server" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.145636 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.147774 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.148096 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.150410 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh"] Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.319213 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/57d1349e-ebb0-4e6c-96db-7b27f9a56494-secret-volume\") pod \"collect-profiles-29320695-9h2wh\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.319776 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwtcr\" (UniqueName: \"kubernetes.io/projected/57d1349e-ebb0-4e6c-96db-7b27f9a56494-kube-api-access-vwtcr\") pod \"collect-profiles-29320695-9h2wh\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.319946 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d1349e-ebb0-4e6c-96db-7b27f9a56494-config-volume\") pod \"collect-profiles-29320695-9h2wh\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.421919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d1349e-ebb0-4e6c-96db-7b27f9a56494-config-volume\") pod \"collect-profiles-29320695-9h2wh\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.421974 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/57d1349e-ebb0-4e6c-96db-7b27f9a56494-secret-volume\") pod \"collect-profiles-29320695-9h2wh\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.422030 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwtcr\" (UniqueName: \"kubernetes.io/projected/57d1349e-ebb0-4e6c-96db-7b27f9a56494-kube-api-access-vwtcr\") pod \"collect-profiles-29320695-9h2wh\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.422768 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d1349e-ebb0-4e6c-96db-7b27f9a56494-config-volume\") pod \"collect-profiles-29320695-9h2wh\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.428694 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/57d1349e-ebb0-4e6c-96db-7b27f9a56494-secret-volume\") pod \"collect-profiles-29320695-9h2wh\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.437408 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwtcr\" (UniqueName: \"kubernetes.io/projected/57d1349e-ebb0-4e6c-96db-7b27f9a56494-kube-api-access-vwtcr\") pod \"collect-profiles-29320695-9h2wh\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.466635 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:00 crc kubenswrapper[4763]: I0930 14:15:00.913448 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh"] Sep 30 14:15:01 crc kubenswrapper[4763]: I0930 14:15:01.077040 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" event={"ID":"57d1349e-ebb0-4e6c-96db-7b27f9a56494","Type":"ContainerStarted","Data":"0bee90b678e8e4caae0c8b6e7d19ba366b4649145f9838d7b6b5b4170c014b0d"} Sep 30 14:15:01 crc kubenswrapper[4763]: I0930 14:15:01.077086 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" event={"ID":"57d1349e-ebb0-4e6c-96db-7b27f9a56494","Type":"ContainerStarted","Data":"a17337aa13fea793b5ff7fbde652538cdaa83cb49d3c76c903a7784b9285c5da"} Sep 30 14:15:01 crc kubenswrapper[4763]: I0930 14:15:01.092935 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" podStartSLOduration=1.092914095 podStartE2EDuration="1.092914095s" podCreationTimestamp="2025-09-30 14:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:15:01.09029533 +0000 UTC m=+2373.228855615" watchObservedRunningTime="2025-09-30 14:15:01.092914095 +0000 UTC m=+2373.231474380" Sep 30 14:15:02 crc kubenswrapper[4763]: I0930 14:15:02.083871 4763 generic.go:334] "Generic (PLEG): container finished" podID="57d1349e-ebb0-4e6c-96db-7b27f9a56494" containerID="0bee90b678e8e4caae0c8b6e7d19ba366b4649145f9838d7b6b5b4170c014b0d" exitCode=0 Sep 30 14:15:02 crc kubenswrapper[4763]: I0930 14:15:02.083911 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" event={"ID":"57d1349e-ebb0-4e6c-96db-7b27f9a56494","Type":"ContainerDied","Data":"0bee90b678e8e4caae0c8b6e7d19ba366b4649145f9838d7b6b5b4170c014b0d"} Sep 30 14:15:03 crc kubenswrapper[4763]: I0930 14:15:03.447750 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:03 crc kubenswrapper[4763]: I0930 14:15:03.587181 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwtcr\" (UniqueName: \"kubernetes.io/projected/57d1349e-ebb0-4e6c-96db-7b27f9a56494-kube-api-access-vwtcr\") pod \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " Sep 30 14:15:03 crc kubenswrapper[4763]: I0930 14:15:03.587274 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/57d1349e-ebb0-4e6c-96db-7b27f9a56494-secret-volume\") pod \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " Sep 30 14:15:03 crc kubenswrapper[4763]: I0930 14:15:03.587345 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d1349e-ebb0-4e6c-96db-7b27f9a56494-config-volume\") pod \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\" (UID: \"57d1349e-ebb0-4e6c-96db-7b27f9a56494\") " Sep 30 14:15:03 crc kubenswrapper[4763]: I0930 14:15:03.587954 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d1349e-ebb0-4e6c-96db-7b27f9a56494-config-volume" (OuterVolumeSpecName: "config-volume") pod "57d1349e-ebb0-4e6c-96db-7b27f9a56494" (UID: "57d1349e-ebb0-4e6c-96db-7b27f9a56494"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:15:03 crc kubenswrapper[4763]: I0930 14:15:03.588210 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d1349e-ebb0-4e6c-96db-7b27f9a56494-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:03 crc kubenswrapper[4763]: I0930 14:15:03.591960 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d1349e-ebb0-4e6c-96db-7b27f9a56494-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "57d1349e-ebb0-4e6c-96db-7b27f9a56494" (UID: "57d1349e-ebb0-4e6c-96db-7b27f9a56494"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:15:03 crc kubenswrapper[4763]: I0930 14:15:03.592692 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d1349e-ebb0-4e6c-96db-7b27f9a56494-kube-api-access-vwtcr" (OuterVolumeSpecName: "kube-api-access-vwtcr") pod "57d1349e-ebb0-4e6c-96db-7b27f9a56494" (UID: "57d1349e-ebb0-4e6c-96db-7b27f9a56494"). InnerVolumeSpecName "kube-api-access-vwtcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:15:03 crc kubenswrapper[4763]: I0930 14:15:03.689441 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwtcr\" (UniqueName: \"kubernetes.io/projected/57d1349e-ebb0-4e6c-96db-7b27f9a56494-kube-api-access-vwtcr\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:03 crc kubenswrapper[4763]: I0930 14:15:03.689493 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/57d1349e-ebb0-4e6c-96db-7b27f9a56494-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:04 crc kubenswrapper[4763]: I0930 14:15:04.099996 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" event={"ID":"57d1349e-ebb0-4e6c-96db-7b27f9a56494","Type":"ContainerDied","Data":"a17337aa13fea793b5ff7fbde652538cdaa83cb49d3c76c903a7784b9285c5da"} Sep 30 14:15:04 crc kubenswrapper[4763]: I0930 14:15:04.100321 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a17337aa13fea793b5ff7fbde652538cdaa83cb49d3c76c903a7784b9285c5da" Sep 30 14:15:04 crc kubenswrapper[4763]: I0930 14:15:04.100058 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh" Sep 30 14:15:04 crc kubenswrapper[4763]: I0930 14:15:04.514349 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw"] Sep 30 14:15:04 crc kubenswrapper[4763]: I0930 14:15:04.521134 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-kpklw"] Sep 30 14:15:06 crc kubenswrapper[4763]: I0930 14:15:06.500543 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a540445-8589-4437-b134-38ba9d38faf0" path="/var/lib/kubelet/pods/6a540445-8589-4437-b134-38ba9d38faf0/volumes" Sep 30 14:15:07 crc kubenswrapper[4763]: I0930 14:15:07.489818 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:15:07 crc kubenswrapper[4763]: E0930 14:15:07.490144 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:15:22 crc kubenswrapper[4763]: I0930 14:15:22.489955 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:15:22 crc kubenswrapper[4763]: E0930 14:15:22.490660 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:15:32 crc kubenswrapper[4763]: I0930 14:15:32.907378 4763 scope.go:117] "RemoveContainer" containerID="9ead3cce3b8bef2eff1fd9882d990535e94a6bcd8889b85de032de55a378c287" Sep 30 
14:15:36 crc kubenswrapper[4763]: I0930 14:15:36.489261 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:15:36 crc kubenswrapper[4763]: E0930 14:15:36.489853 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:15:48 crc kubenswrapper[4763]: I0930 14:15:48.494119 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:15:48 crc kubenswrapper[4763]: E0930 14:15:48.494949 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:16:01 crc kubenswrapper[4763]: I0930 14:16:01.489519 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:16:01 crc kubenswrapper[4763]: E0930 14:16:01.490546 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:16:16 crc kubenswrapper[4763]: I0930 14:16:16.490001 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:16:16 crc kubenswrapper[4763]: E0930 14:16:16.490705 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:16:31 crc kubenswrapper[4763]: I0930 14:16:31.489456 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:16:31 crc kubenswrapper[4763]: E0930 14:16:31.490408 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:16:44 crc kubenswrapper[4763]: I0930 14:16:44.489369 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:16:44 crc 
kubenswrapper[4763]: E0930 14:16:44.490292 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:16:56 crc kubenswrapper[4763]: I0930 14:16:56.489073 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:16:56 crc kubenswrapper[4763]: E0930 14:16:56.489826 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:17:10 crc kubenswrapper[4763]: I0930 14:17:10.490484 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:17:10 crc kubenswrapper[4763]: I0930 14:17:10.977827 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"a56544cbcc64bd3b357bedf67d6269d459c5e046c0ab64cd6996477ee180191a"} Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.726184 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4l75n"] Sep 30 14:19:35 crc kubenswrapper[4763]: E0930 14:19:35.727190 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d1349e-ebb0-4e6c-96db-7b27f9a56494" containerName="collect-profiles" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.727209 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d1349e-ebb0-4e6c-96db-7b27f9a56494" containerName="collect-profiles" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.727388 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d1349e-ebb0-4e6c-96db-7b27f9a56494" containerName="collect-profiles" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.728645 4763 util.go:30] "No sandbox for pod can be found. 
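
The machine-config-daemon entries above repeat roughly every ten to fifteen seconds from 14:14:14 through 14:16:56: each pod sync attempts StartContainer and the back-off gate rejects it until the logged 5m0s window has elapsed, so the container is only actually restarted at 14:17:10, consistent with a crash shortly before this excerpt's window. A sketch of a capped exponential back-off of that shape; the 10-second base doubling to a five-minute cap is assumed from the "back-off 5m0s" message and the commonly cited kubelet defaults, not read from kubelet source:

    package main

    import (
    	"fmt"
    	"time"
    )

    // nextDelay doubles the previous delay and clamps it to the limit,
    // giving a 10s -> 20s -> 40s -> ... -> 5m0s back-off ladder.
    func nextDelay(prev, limit time.Duration) time.Duration {
    	if prev == 0 {
    		return 10 * time.Second
    	}
    	d := 2 * prev
    	if d > limit {
    		return limit
    	}
    	return d
    }

    func main() {
    	var d time.Duration
    	for i := 0; i < 7; i++ {
    		d = nextDelay(d, 5*time.Minute)
    		fmt.Println(d) // 10s 20s 40s 1m20s 2m40s 5m0s 5m0s
    	}
    }
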
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.733148 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7fcc\" (UniqueName: \"kubernetes.io/projected/790228ee-74b1-4a45-a9d5-2535489eef9d-kube-api-access-j7fcc\") pod \"redhat-marketplace-4l75n\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.733210 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-catalog-content\") pod \"redhat-marketplace-4l75n\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.733325 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-utilities\") pod \"redhat-marketplace-4l75n\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.738883 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4l75n"] Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.834310 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-utilities\") pod \"redhat-marketplace-4l75n\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.834414 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7fcc\" (UniqueName: \"kubernetes.io/projected/790228ee-74b1-4a45-a9d5-2535489eef9d-kube-api-access-j7fcc\") pod \"redhat-marketplace-4l75n\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.834473 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-catalog-content\") pod \"redhat-marketplace-4l75n\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.834947 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-utilities\") pod \"redhat-marketplace-4l75n\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.835023 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-catalog-content\") pod \"redhat-marketplace-4l75n\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:35 crc kubenswrapper[4763]: I0930 14:19:35.855847 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j7fcc\" (UniqueName: \"kubernetes.io/projected/790228ee-74b1-4a45-a9d5-2535489eef9d-kube-api-access-j7fcc\") pod \"redhat-marketplace-4l75n\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:36 crc kubenswrapper[4763]: I0930 14:19:36.044024 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:36 crc kubenswrapper[4763]: I0930 14:19:36.059760 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:19:36 crc kubenswrapper[4763]: I0930 14:19:36.059823 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:19:36 crc kubenswrapper[4763]: I0930 14:19:36.238304 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4l75n"] Sep 30 14:19:37 crc kubenswrapper[4763]: I0930 14:19:37.012792 4763 generic.go:334] "Generic (PLEG): container finished" podID="790228ee-74b1-4a45-a9d5-2535489eef9d" containerID="761cd7d8140c28dc9d9feb7bfdc0e3983901f831bede0e321066f297354a74b8" exitCode=0 Sep 30 14:19:37 crc kubenswrapper[4763]: I0930 14:19:37.012874 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l75n" event={"ID":"790228ee-74b1-4a45-a9d5-2535489eef9d","Type":"ContainerDied","Data":"761cd7d8140c28dc9d9feb7bfdc0e3983901f831bede0e321066f297354a74b8"} Sep 30 14:19:37 crc kubenswrapper[4763]: I0930 14:19:37.013934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l75n" event={"ID":"790228ee-74b1-4a45-a9d5-2535489eef9d","Type":"ContainerStarted","Data":"2acaec2d1c3a23330ecaa35722cc82931fd2fa21d5feab4499d108594dbd2ef5"} Sep 30 14:19:37 crc kubenswrapper[4763]: I0930 14:19:37.014819 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:19:38 crc kubenswrapper[4763]: I0930 14:19:38.027443 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l75n" event={"ID":"790228ee-74b1-4a45-a9d5-2535489eef9d","Type":"ContainerStarted","Data":"63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c"} Sep 30 14:19:39 crc kubenswrapper[4763]: I0930 14:19:39.037734 4763 generic.go:334] "Generic (PLEG): container finished" podID="790228ee-74b1-4a45-a9d5-2535489eef9d" containerID="63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c" exitCode=0 Sep 30 14:19:39 crc kubenswrapper[4763]: I0930 14:19:39.037851 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l75n" event={"ID":"790228ee-74b1-4a45-a9d5-2535489eef9d","Type":"ContainerDied","Data":"63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c"} Sep 30 14:19:40 crc kubenswrapper[4763]: I0930 14:19:40.048215 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l75n" 
event={"ID":"790228ee-74b1-4a45-a9d5-2535489eef9d","Type":"ContainerStarted","Data":"75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46"} Sep 30 14:19:40 crc kubenswrapper[4763]: I0930 14:19:40.067278 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4l75n" podStartSLOduration=2.512754379 podStartE2EDuration="5.067257876s" podCreationTimestamp="2025-09-30 14:19:35 +0000 UTC" firstStartedPulling="2025-09-30 14:19:37.014530539 +0000 UTC m=+2649.153090824" lastFinishedPulling="2025-09-30 14:19:39.569034036 +0000 UTC m=+2651.707594321" observedRunningTime="2025-09-30 14:19:40.06622673 +0000 UTC m=+2652.204787015" watchObservedRunningTime="2025-09-30 14:19:40.067257876 +0000 UTC m=+2652.205818161" Sep 30 14:19:46 crc kubenswrapper[4763]: I0930 14:19:46.044945 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:46 crc kubenswrapper[4763]: I0930 14:19:46.045510 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:46 crc kubenswrapper[4763]: I0930 14:19:46.091480 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:46 crc kubenswrapper[4763]: I0930 14:19:46.150468 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:46 crc kubenswrapper[4763]: I0930 14:19:46.331564 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4l75n"] Sep 30 14:19:48 crc kubenswrapper[4763]: I0930 14:19:48.103671 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4l75n" podUID="790228ee-74b1-4a45-a9d5-2535489eef9d" containerName="registry-server" containerID="cri-o://75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46" gracePeriod=2 Sep 30 14:19:48 crc kubenswrapper[4763]: I0930 14:19:48.473740 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:48 crc kubenswrapper[4763]: I0930 14:19:48.624828 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-utilities\") pod \"790228ee-74b1-4a45-a9d5-2535489eef9d\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " Sep 30 14:19:48 crc kubenswrapper[4763]: I0930 14:19:48.624949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7fcc\" (UniqueName: \"kubernetes.io/projected/790228ee-74b1-4a45-a9d5-2535489eef9d-kube-api-access-j7fcc\") pod \"790228ee-74b1-4a45-a9d5-2535489eef9d\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " Sep 30 14:19:48 crc kubenswrapper[4763]: I0930 14:19:48.624993 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-catalog-content\") pod \"790228ee-74b1-4a45-a9d5-2535489eef9d\" (UID: \"790228ee-74b1-4a45-a9d5-2535489eef9d\") " Sep 30 14:19:48 crc kubenswrapper[4763]: I0930 14:19:48.626316 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-utilities" (OuterVolumeSpecName: "utilities") pod "790228ee-74b1-4a45-a9d5-2535489eef9d" (UID: "790228ee-74b1-4a45-a9d5-2535489eef9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:19:48 crc kubenswrapper[4763]: I0930 14:19:48.630200 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790228ee-74b1-4a45-a9d5-2535489eef9d-kube-api-access-j7fcc" (OuterVolumeSpecName: "kube-api-access-j7fcc") pod "790228ee-74b1-4a45-a9d5-2535489eef9d" (UID: "790228ee-74b1-4a45-a9d5-2535489eef9d"). InnerVolumeSpecName "kube-api-access-j7fcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:19:48 crc kubenswrapper[4763]: I0930 14:19:48.637313 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "790228ee-74b1-4a45-a9d5-2535489eef9d" (UID: "790228ee-74b1-4a45-a9d5-2535489eef9d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:19:48 crc kubenswrapper[4763]: I0930 14:19:48.726713 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7fcc\" (UniqueName: \"kubernetes.io/projected/790228ee-74b1-4a45-a9d5-2535489eef9d-kube-api-access-j7fcc\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:48 crc kubenswrapper[4763]: I0930 14:19:48.726762 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:48 crc kubenswrapper[4763]: I0930 14:19:48.727213 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790228ee-74b1-4a45-a9d5-2535489eef9d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.114114 4763 generic.go:334] "Generic (PLEG): container finished" podID="790228ee-74b1-4a45-a9d5-2535489eef9d" containerID="75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46" exitCode=0 Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.114166 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l75n" event={"ID":"790228ee-74b1-4a45-a9d5-2535489eef9d","Type":"ContainerDied","Data":"75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46"} Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.114204 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l75n" event={"ID":"790228ee-74b1-4a45-a9d5-2535489eef9d","Type":"ContainerDied","Data":"2acaec2d1c3a23330ecaa35722cc82931fd2fa21d5feab4499d108594dbd2ef5"} Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.114227 4763 scope.go:117] "RemoveContainer" containerID="75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46" Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.114274 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4l75n" Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.140730 4763 scope.go:117] "RemoveContainer" containerID="63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c" Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.165121 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4l75n"] Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.167384 4763 scope.go:117] "RemoveContainer" containerID="761cd7d8140c28dc9d9feb7bfdc0e3983901f831bede0e321066f297354a74b8" Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.169838 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4l75n"] Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.190153 4763 scope.go:117] "RemoveContainer" containerID="75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46" Sep 30 14:19:49 crc kubenswrapper[4763]: E0930 14:19:49.190618 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46\": container with ID starting with 75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46 not found: ID does not exist" containerID="75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46" Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.190649 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46"} err="failed to get container status \"75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46\": rpc error: code = NotFound desc = could not find container \"75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46\": container with ID starting with 75be5095df2d8059f89efd591b89495160f611a92e25ad1860cabec35800df46 not found: ID does not exist" Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.190671 4763 scope.go:117] "RemoveContainer" containerID="63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c" Sep 30 14:19:49 crc kubenswrapper[4763]: E0930 14:19:49.190873 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c\": container with ID starting with 63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c not found: ID does not exist" containerID="63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c" Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.190896 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c"} err="failed to get container status \"63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c\": rpc error: code = NotFound desc = could not find container \"63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c\": container with ID starting with 63da9d25f7e16adff0e0c655931eba2fa0adbf481491c8aa62fd65aef3d37d6c not found: ID does not exist" Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.190911 4763 scope.go:117] "RemoveContainer" containerID="761cd7d8140c28dc9d9feb7bfdc0e3983901f831bede0e321066f297354a74b8" Sep 30 14:19:49 crc kubenswrapper[4763]: E0930 14:19:49.191071 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"761cd7d8140c28dc9d9feb7bfdc0e3983901f831bede0e321066f297354a74b8\": container with ID starting with 761cd7d8140c28dc9d9feb7bfdc0e3983901f831bede0e321066f297354a74b8 not found: ID does not exist" containerID="761cd7d8140c28dc9d9feb7bfdc0e3983901f831bede0e321066f297354a74b8" Sep 30 14:19:49 crc kubenswrapper[4763]: I0930 14:19:49.191097 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"761cd7d8140c28dc9d9feb7bfdc0e3983901f831bede0e321066f297354a74b8"} err="failed to get container status \"761cd7d8140c28dc9d9feb7bfdc0e3983901f831bede0e321066f297354a74b8\": rpc error: code = NotFound desc = could not find container \"761cd7d8140c28dc9d9feb7bfdc0e3983901f831bede0e321066f297354a74b8\": container with ID starting with 761cd7d8140c28dc9d9feb7bfdc0e3983901f831bede0e321066f297354a74b8 not found: ID does not exist" Sep 30 14:19:50 crc kubenswrapper[4763]: I0930 14:19:50.498341 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790228ee-74b1-4a45-a9d5-2535489eef9d" path="/var/lib/kubelet/pods/790228ee-74b1-4a45-a9d5-2535489eef9d/volumes" Sep 30 14:20:06 crc kubenswrapper[4763]: I0930 14:20:06.059762 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:20:06 crc kubenswrapper[4763]: I0930 14:20:06.060162 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:20:36 crc kubenswrapper[4763]: I0930 14:20:36.060742 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:20:36 crc kubenswrapper[4763]: I0930 14:20:36.063386 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:20:36 crc kubenswrapper[4763]: I0930 14:20:36.063764 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 14:20:36 crc kubenswrapper[4763]: I0930 14:20:36.065182 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a56544cbcc64bd3b357bedf67d6269d459c5e046c0ab64cd6996477ee180191a"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:20:36 crc kubenswrapper[4763]: I0930 14:20:36.065581 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" 
podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://a56544cbcc64bd3b357bedf67d6269d459c5e046c0ab64cd6996477ee180191a" gracePeriod=600 Sep 30 14:20:36 crc kubenswrapper[4763]: I0930 14:20:36.438436 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="a56544cbcc64bd3b357bedf67d6269d459c5e046c0ab64cd6996477ee180191a" exitCode=0 Sep 30 14:20:36 crc kubenswrapper[4763]: I0930 14:20:36.438490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"a56544cbcc64bd3b357bedf67d6269d459c5e046c0ab64cd6996477ee180191a"} Sep 30 14:20:36 crc kubenswrapper[4763]: I0930 14:20:36.438914 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f"} Sep 30 14:20:36 crc kubenswrapper[4763]: I0930 14:20:36.438959 4763 scope.go:117] "RemoveContainer" containerID="a46baa0be819b83fbe8fc05aa370e7e00754713813d773ac26d3666731741d6e" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.092200 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g59f4"] Sep 30 14:21:28 crc kubenswrapper[4763]: E0930 14:21:28.095681 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790228ee-74b1-4a45-a9d5-2535489eef9d" containerName="extract-utilities" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.095808 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="790228ee-74b1-4a45-a9d5-2535489eef9d" containerName="extract-utilities" Sep 30 14:21:28 crc kubenswrapper[4763]: E0930 14:21:28.095918 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790228ee-74b1-4a45-a9d5-2535489eef9d" containerName="registry-server" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.095987 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="790228ee-74b1-4a45-a9d5-2535489eef9d" containerName="registry-server" Sep 30 14:21:28 crc kubenswrapper[4763]: E0930 14:21:28.096075 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790228ee-74b1-4a45-a9d5-2535489eef9d" containerName="extract-content" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.096142 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="790228ee-74b1-4a45-a9d5-2535489eef9d" containerName="extract-content" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.096640 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="790228ee-74b1-4a45-a9d5-2535489eef9d" containerName="registry-server" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.103555 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.113912 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g59f4"] Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.192189 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-utilities\") pod \"community-operators-g59f4\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.192246 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2plp4\" (UniqueName: \"kubernetes.io/projected/638203ed-fe2a-4f71-91bf-cf563e8ba65c-kube-api-access-2plp4\") pod \"community-operators-g59f4\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.192299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-catalog-content\") pod \"community-operators-g59f4\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.293890 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-catalog-content\") pod \"community-operators-g59f4\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.294038 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-utilities\") pod \"community-operators-g59f4\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.294064 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2plp4\" (UniqueName: \"kubernetes.io/projected/638203ed-fe2a-4f71-91bf-cf563e8ba65c-kube-api-access-2plp4\") pod \"community-operators-g59f4\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.294563 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-catalog-content\") pod \"community-operators-g59f4\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.294713 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-utilities\") pod \"community-operators-g59f4\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.315394 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2plp4\" (UniqueName: \"kubernetes.io/projected/638203ed-fe2a-4f71-91bf-cf563e8ba65c-kube-api-access-2plp4\") pod \"community-operators-g59f4\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.433663 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:28 crc kubenswrapper[4763]: I0930 14:21:28.878729 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g59f4"] Sep 30 14:21:29 crc kubenswrapper[4763]: I0930 14:21:29.816064 4763 generic.go:334] "Generic (PLEG): container finished" podID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" containerID="884c4d2fcbec2da72f7dd5f8e39ffffe44aa3d233ec2eacddb3f17ad63838bb6" exitCode=0 Sep 30 14:21:29 crc kubenswrapper[4763]: I0930 14:21:29.816171 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59f4" event={"ID":"638203ed-fe2a-4f71-91bf-cf563e8ba65c","Type":"ContainerDied","Data":"884c4d2fcbec2da72f7dd5f8e39ffffe44aa3d233ec2eacddb3f17ad63838bb6"} Sep 30 14:21:29 crc kubenswrapper[4763]: I0930 14:21:29.816374 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59f4" event={"ID":"638203ed-fe2a-4f71-91bf-cf563e8ba65c","Type":"ContainerStarted","Data":"94e7e77a06b50b66d0d2f62c26167eab983727da5dbc5bd41269194cb53509d5"} Sep 30 14:21:30 crc kubenswrapper[4763]: I0930 14:21:30.827471 4763 generic.go:334] "Generic (PLEG): container finished" podID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" containerID="c84ae5e98dcbceb247edff7d9fa5352b54880ea1b5f97b2052b16ac467541641" exitCode=0 Sep 30 14:21:30 crc kubenswrapper[4763]: I0930 14:21:30.827563 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59f4" event={"ID":"638203ed-fe2a-4f71-91bf-cf563e8ba65c","Type":"ContainerDied","Data":"c84ae5e98dcbceb247edff7d9fa5352b54880ea1b5f97b2052b16ac467541641"} Sep 30 14:21:31 crc kubenswrapper[4763]: I0930 14:21:31.839312 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59f4" event={"ID":"638203ed-fe2a-4f71-91bf-cf563e8ba65c","Type":"ContainerStarted","Data":"770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae"} Sep 30 14:21:31 crc kubenswrapper[4763]: I0930 14:21:31.858197 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g59f4" podStartSLOduration=2.365181445 podStartE2EDuration="3.858174691s" podCreationTimestamp="2025-09-30 14:21:28 +0000 UTC" firstStartedPulling="2025-09-30 14:21:29.818494337 +0000 UTC m=+2761.957054622" lastFinishedPulling="2025-09-30 14:21:31.311487593 +0000 UTC m=+2763.450047868" observedRunningTime="2025-09-30 14:21:31.855519145 +0000 UTC m=+2763.994079430" watchObservedRunningTime="2025-09-30 14:21:31.858174691 +0000 UTC m=+2763.996734986" Sep 30 14:21:38 crc kubenswrapper[4763]: I0930 14:21:38.434051 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:38 crc kubenswrapper[4763]: I0930 14:21:38.434529 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:38 crc kubenswrapper[4763]: I0930 14:21:38.477299 4763 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:38 crc kubenswrapper[4763]: I0930 14:21:38.954376 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:39 crc kubenswrapper[4763]: I0930 14:21:39.012864 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g59f4"] Sep 30 14:21:40 crc kubenswrapper[4763]: I0930 14:21:40.920218 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g59f4" podUID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" containerName="registry-server" containerID="cri-o://770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae" gracePeriod=2 Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.818250 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.888661 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2plp4\" (UniqueName: \"kubernetes.io/projected/638203ed-fe2a-4f71-91bf-cf563e8ba65c-kube-api-access-2plp4\") pod \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.888836 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-catalog-content\") pod \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.888874 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-utilities\") pod \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\" (UID: \"638203ed-fe2a-4f71-91bf-cf563e8ba65c\") " Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.890737 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-utilities" (OuterVolumeSpecName: "utilities") pod "638203ed-fe2a-4f71-91bf-cf563e8ba65c" (UID: "638203ed-fe2a-4f71-91bf-cf563e8ba65c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.896208 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/638203ed-fe2a-4f71-91bf-cf563e8ba65c-kube-api-access-2plp4" (OuterVolumeSpecName: "kube-api-access-2plp4") pod "638203ed-fe2a-4f71-91bf-cf563e8ba65c" (UID: "638203ed-fe2a-4f71-91bf-cf563e8ba65c"). InnerVolumeSpecName "kube-api-access-2plp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.941108 4763 generic.go:334] "Generic (PLEG): container finished" podID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" containerID="770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae" exitCode=0 Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.941166 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59f4" event={"ID":"638203ed-fe2a-4f71-91bf-cf563e8ba65c","Type":"ContainerDied","Data":"770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae"} Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.941208 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59f4" event={"ID":"638203ed-fe2a-4f71-91bf-cf563e8ba65c","Type":"ContainerDied","Data":"94e7e77a06b50b66d0d2f62c26167eab983727da5dbc5bd41269194cb53509d5"} Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.941230 4763 scope.go:117] "RemoveContainer" containerID="770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae" Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.941445 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g59f4" Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.967023 4763 scope.go:117] "RemoveContainer" containerID="c84ae5e98dcbceb247edff7d9fa5352b54880ea1b5f97b2052b16ac467541641" Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.974887 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "638203ed-fe2a-4f71-91bf-cf563e8ba65c" (UID: "638203ed-fe2a-4f71-91bf-cf563e8ba65c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.988418 4763 scope.go:117] "RemoveContainer" containerID="884c4d2fcbec2da72f7dd5f8e39ffffe44aa3d233ec2eacddb3f17ad63838bb6" Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.991211 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2plp4\" (UniqueName: \"kubernetes.io/projected/638203ed-fe2a-4f71-91bf-cf563e8ba65c-kube-api-access-2plp4\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.991257 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:41 crc kubenswrapper[4763]: I0930 14:21:41.991269 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/638203ed-fe2a-4f71-91bf-cf563e8ba65c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:42 crc kubenswrapper[4763]: I0930 14:21:42.014287 4763 scope.go:117] "RemoveContainer" containerID="770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae" Sep 30 14:21:42 crc kubenswrapper[4763]: E0930 14:21:42.015196 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae\": container with ID starting with 770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae not found: ID does not exist" containerID="770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae" Sep 30 14:21:42 crc kubenswrapper[4763]: I0930 14:21:42.015239 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae"} err="failed to get container status \"770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae\": rpc error: code = NotFound desc = could not find container \"770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae\": container with ID starting with 770274b0e3de964fb6f0b4affc9441023556dbd772cbfa7ef5cf1536727011ae not found: ID does not exist" Sep 30 14:21:42 crc kubenswrapper[4763]: I0930 14:21:42.015264 4763 scope.go:117] "RemoveContainer" containerID="c84ae5e98dcbceb247edff7d9fa5352b54880ea1b5f97b2052b16ac467541641" Sep 30 14:21:42 crc kubenswrapper[4763]: E0930 14:21:42.015512 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c84ae5e98dcbceb247edff7d9fa5352b54880ea1b5f97b2052b16ac467541641\": container with ID starting with c84ae5e98dcbceb247edff7d9fa5352b54880ea1b5f97b2052b16ac467541641 not found: ID does not exist" containerID="c84ae5e98dcbceb247edff7d9fa5352b54880ea1b5f97b2052b16ac467541641" Sep 30 14:21:42 crc kubenswrapper[4763]: I0930 14:21:42.015536 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84ae5e98dcbceb247edff7d9fa5352b54880ea1b5f97b2052b16ac467541641"} err="failed to get container status \"c84ae5e98dcbceb247edff7d9fa5352b54880ea1b5f97b2052b16ac467541641\": rpc error: code = NotFound desc = could not find container \"c84ae5e98dcbceb247edff7d9fa5352b54880ea1b5f97b2052b16ac467541641\": container with ID starting with c84ae5e98dcbceb247edff7d9fa5352b54880ea1b5f97b2052b16ac467541641 not found: ID does not exist" Sep 30 14:21:42 crc 
kubenswrapper[4763]: I0930 14:21:42.015550 4763 scope.go:117] "RemoveContainer" containerID="884c4d2fcbec2da72f7dd5f8e39ffffe44aa3d233ec2eacddb3f17ad63838bb6" Sep 30 14:21:42 crc kubenswrapper[4763]: E0930 14:21:42.015765 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"884c4d2fcbec2da72f7dd5f8e39ffffe44aa3d233ec2eacddb3f17ad63838bb6\": container with ID starting with 884c4d2fcbec2da72f7dd5f8e39ffffe44aa3d233ec2eacddb3f17ad63838bb6 not found: ID does not exist" containerID="884c4d2fcbec2da72f7dd5f8e39ffffe44aa3d233ec2eacddb3f17ad63838bb6" Sep 30 14:21:42 crc kubenswrapper[4763]: I0930 14:21:42.015811 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"884c4d2fcbec2da72f7dd5f8e39ffffe44aa3d233ec2eacddb3f17ad63838bb6"} err="failed to get container status \"884c4d2fcbec2da72f7dd5f8e39ffffe44aa3d233ec2eacddb3f17ad63838bb6\": rpc error: code = NotFound desc = could not find container \"884c4d2fcbec2da72f7dd5f8e39ffffe44aa3d233ec2eacddb3f17ad63838bb6\": container with ID starting with 884c4d2fcbec2da72f7dd5f8e39ffffe44aa3d233ec2eacddb3f17ad63838bb6 not found: ID does not exist" Sep 30 14:21:42 crc kubenswrapper[4763]: I0930 14:21:42.282302 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g59f4"] Sep 30 14:21:42 crc kubenswrapper[4763]: I0930 14:21:42.287376 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g59f4"] Sep 30 14:21:42 crc kubenswrapper[4763]: I0930 14:21:42.507589 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" path="/var/lib/kubelet/pods/638203ed-fe2a-4f71-91bf-cf563e8ba65c/volumes" Sep 30 14:22:36 crc kubenswrapper[4763]: I0930 14:22:36.059865 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:22:36 crc kubenswrapper[4763]: I0930 14:22:36.060483 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:23:06 crc kubenswrapper[4763]: I0930 14:23:06.059891 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:23:06 crc kubenswrapper[4763]: I0930 14:23:06.060406 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.681987 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m6jz5"] Sep 30 14:23:09 crc kubenswrapper[4763]: E0930 14:23:09.682784 4763 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" containerName="registry-server" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.682802 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" containerName="registry-server" Sep 30 14:23:09 crc kubenswrapper[4763]: E0930 14:23:09.682832 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" containerName="extract-content" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.682841 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" containerName="extract-content" Sep 30 14:23:09 crc kubenswrapper[4763]: E0930 14:23:09.682866 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" containerName="extract-utilities" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.682877 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" containerName="extract-utilities" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.683058 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="638203ed-fe2a-4f71-91bf-cf563e8ba65c" containerName="registry-server" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.684404 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.694837 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6jz5"] Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.778158 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-utilities\") pod \"redhat-operators-m6jz5\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.778255 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-catalog-content\") pod \"redhat-operators-m6jz5\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.778326 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf6z6\" (UniqueName: \"kubernetes.io/projected/bfee86c1-486b-4e4e-ac0d-5b1025238dca-kube-api-access-hf6z6\") pod \"redhat-operators-m6jz5\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.879708 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-utilities\") pod \"redhat-operators-m6jz5\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.879826 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-catalog-content\") pod \"redhat-operators-m6jz5\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.879856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf6z6\" (UniqueName: \"kubernetes.io/projected/bfee86c1-486b-4e4e-ac0d-5b1025238dca-kube-api-access-hf6z6\") pod \"redhat-operators-m6jz5\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.880195 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-utilities\") pod \"redhat-operators-m6jz5\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.880961 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-catalog-content\") pod \"redhat-operators-m6jz5\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:09 crc kubenswrapper[4763]: I0930 14:23:09.900077 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf6z6\" (UniqueName: \"kubernetes.io/projected/bfee86c1-486b-4e4e-ac0d-5b1025238dca-kube-api-access-hf6z6\") pod \"redhat-operators-m6jz5\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:10 crc kubenswrapper[4763]: I0930 14:23:10.005813 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:10 crc kubenswrapper[4763]: I0930 14:23:10.466512 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6jz5"] Sep 30 14:23:10 crc kubenswrapper[4763]: I0930 14:23:10.558774 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6jz5" event={"ID":"bfee86c1-486b-4e4e-ac0d-5b1025238dca","Type":"ContainerStarted","Data":"f211f66d6a651cc8b6a84fbc1bd45e3218da8f8c8738d6aaffd02507ec58fe3f"} Sep 30 14:23:11 crc kubenswrapper[4763]: I0930 14:23:11.567139 4763 generic.go:334] "Generic (PLEG): container finished" podID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" containerID="66344ad42e9b74c965e378bebecfa862c1016edaefc4b2a90c9195759c2a5a48" exitCode=0 Sep 30 14:23:11 crc kubenswrapper[4763]: I0930 14:23:11.567190 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6jz5" event={"ID":"bfee86c1-486b-4e4e-ac0d-5b1025238dca","Type":"ContainerDied","Data":"66344ad42e9b74c965e378bebecfa862c1016edaefc4b2a90c9195759c2a5a48"} Sep 30 14:23:13 crc kubenswrapper[4763]: I0930 14:23:13.591666 4763 generic.go:334] "Generic (PLEG): container finished" podID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" containerID="86404546f2b02eaf5a79afa241695d2481d2cbb8e95f169d23c2e33cd835ddd9" exitCode=0 Sep 30 14:23:13 crc kubenswrapper[4763]: I0930 14:23:13.592067 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6jz5" event={"ID":"bfee86c1-486b-4e4e-ac0d-5b1025238dca","Type":"ContainerDied","Data":"86404546f2b02eaf5a79afa241695d2481d2cbb8e95f169d23c2e33cd835ddd9"} Sep 30 14:23:14 crc kubenswrapper[4763]: I0930 14:23:14.600472 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6jz5" event={"ID":"bfee86c1-486b-4e4e-ac0d-5b1025238dca","Type":"ContainerStarted","Data":"9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d"} Sep 30 14:23:20 crc kubenswrapper[4763]: I0930 14:23:20.006505 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:20 crc kubenswrapper[4763]: I0930 14:23:20.007138 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:20 crc kubenswrapper[4763]: I0930 14:23:20.045345 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:20 crc kubenswrapper[4763]: I0930 14:23:20.067919 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m6jz5" podStartSLOduration=8.654067264 podStartE2EDuration="11.067897485s" podCreationTimestamp="2025-09-30 14:23:09 +0000 UTC" firstStartedPulling="2025-09-30 14:23:11.56945413 +0000 UTC m=+2863.708014415" lastFinishedPulling="2025-09-30 14:23:13.983284351 +0000 UTC m=+2866.121844636" observedRunningTime="2025-09-30 14:23:14.622891105 +0000 UTC m=+2866.761451380" watchObservedRunningTime="2025-09-30 14:23:20.067897485 +0000 UTC m=+2872.206457770" Sep 30 14:23:20 crc kubenswrapper[4763]: I0930 14:23:20.682039 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:20 crc kubenswrapper[4763]: I0930 14:23:20.720425 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-m6jz5"] Sep 30 14:23:22 crc kubenswrapper[4763]: I0930 14:23:22.656920 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m6jz5" podUID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" containerName="registry-server" containerID="cri-o://9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d" gracePeriod=2 Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.033273 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.176959 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-utilities\") pod \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.177037 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf6z6\" (UniqueName: \"kubernetes.io/projected/bfee86c1-486b-4e4e-ac0d-5b1025238dca-kube-api-access-hf6z6\") pod \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.177117 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-catalog-content\") pod \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\" (UID: \"bfee86c1-486b-4e4e-ac0d-5b1025238dca\") " Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.178307 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-utilities" (OuterVolumeSpecName: "utilities") pod "bfee86c1-486b-4e4e-ac0d-5b1025238dca" (UID: "bfee86c1-486b-4e4e-ac0d-5b1025238dca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.183651 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfee86c1-486b-4e4e-ac0d-5b1025238dca-kube-api-access-hf6z6" (OuterVolumeSpecName: "kube-api-access-hf6z6") pod "bfee86c1-486b-4e4e-ac0d-5b1025238dca" (UID: "bfee86c1-486b-4e4e-ac0d-5b1025238dca"). InnerVolumeSpecName "kube-api-access-hf6z6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.279578 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.279633 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf6z6\" (UniqueName: \"kubernetes.io/projected/bfee86c1-486b-4e4e-ac0d-5b1025238dca-kube-api-access-hf6z6\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.667561 4763 generic.go:334] "Generic (PLEG): container finished" podID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" containerID="9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d" exitCode=0 Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.667630 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6jz5" event={"ID":"bfee86c1-486b-4e4e-ac0d-5b1025238dca","Type":"ContainerDied","Data":"9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d"} Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.667679 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6jz5" event={"ID":"bfee86c1-486b-4e4e-ac0d-5b1025238dca","Type":"ContainerDied","Data":"f211f66d6a651cc8b6a84fbc1bd45e3218da8f8c8738d6aaffd02507ec58fe3f"} Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.667698 4763 scope.go:117] "RemoveContainer" containerID="9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.667693 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m6jz5" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.685896 4763 scope.go:117] "RemoveContainer" containerID="86404546f2b02eaf5a79afa241695d2481d2cbb8e95f169d23c2e33cd835ddd9" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.704997 4763 scope.go:117] "RemoveContainer" containerID="66344ad42e9b74c965e378bebecfa862c1016edaefc4b2a90c9195759c2a5a48" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.742889 4763 scope.go:117] "RemoveContainer" containerID="9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d" Sep 30 14:23:23 crc kubenswrapper[4763]: E0930 14:23:23.743536 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d\": container with ID starting with 9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d not found: ID does not exist" containerID="9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.743573 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d"} err="failed to get container status \"9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d\": rpc error: code = NotFound desc = could not find container \"9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d\": container with ID starting with 9bd30ff0d289e68f8eb48f84785183f012eee87a685ee092f5bdd61aba31d74d not found: ID does not exist" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.743616 4763 scope.go:117] "RemoveContainer" containerID="86404546f2b02eaf5a79afa241695d2481d2cbb8e95f169d23c2e33cd835ddd9" Sep 30 14:23:23 crc kubenswrapper[4763]: E0930 14:23:23.743902 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86404546f2b02eaf5a79afa241695d2481d2cbb8e95f169d23c2e33cd835ddd9\": container with ID starting with 86404546f2b02eaf5a79afa241695d2481d2cbb8e95f169d23c2e33cd835ddd9 not found: ID does not exist" containerID="86404546f2b02eaf5a79afa241695d2481d2cbb8e95f169d23c2e33cd835ddd9" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.743926 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86404546f2b02eaf5a79afa241695d2481d2cbb8e95f169d23c2e33cd835ddd9"} err="failed to get container status \"86404546f2b02eaf5a79afa241695d2481d2cbb8e95f169d23c2e33cd835ddd9\": rpc error: code = NotFound desc = could not find container \"86404546f2b02eaf5a79afa241695d2481d2cbb8e95f169d23c2e33cd835ddd9\": container with ID starting with 86404546f2b02eaf5a79afa241695d2481d2cbb8e95f169d23c2e33cd835ddd9 not found: ID does not exist" Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.743942 4763 scope.go:117] "RemoveContainer" containerID="66344ad42e9b74c965e378bebecfa862c1016edaefc4b2a90c9195759c2a5a48" Sep 30 14:23:23 crc kubenswrapper[4763]: E0930 14:23:23.744226 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66344ad42e9b74c965e378bebecfa862c1016edaefc4b2a90c9195759c2a5a48\": container with ID starting with 66344ad42e9b74c965e378bebecfa862c1016edaefc4b2a90c9195759c2a5a48 not found: ID does not exist" containerID="66344ad42e9b74c965e378bebecfa862c1016edaefc4b2a90c9195759c2a5a48" 
Sep 30 14:23:23 crc kubenswrapper[4763]: I0930 14:23:23.744279 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66344ad42e9b74c965e378bebecfa862c1016edaefc4b2a90c9195759c2a5a48"} err="failed to get container status \"66344ad42e9b74c965e378bebecfa862c1016edaefc4b2a90c9195759c2a5a48\": rpc error: code = NotFound desc = could not find container \"66344ad42e9b74c965e378bebecfa862c1016edaefc4b2a90c9195759c2a5a48\": container with ID starting with 66344ad42e9b74c965e378bebecfa862c1016edaefc4b2a90c9195759c2a5a48 not found: ID does not exist" Sep 30 14:23:24 crc kubenswrapper[4763]: I0930 14:23:24.603154 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfee86c1-486b-4e4e-ac0d-5b1025238dca" (UID: "bfee86c1-486b-4e4e-ac0d-5b1025238dca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:24 crc kubenswrapper[4763]: I0930 14:23:24.700498 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfee86c1-486b-4e4e-ac0d-5b1025238dca-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:24 crc kubenswrapper[4763]: I0930 14:23:24.901794 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m6jz5"] Sep 30 14:23:24 crc kubenswrapper[4763]: I0930 14:23:24.907056 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m6jz5"] Sep 30 14:23:26 crc kubenswrapper[4763]: I0930 14:23:26.498931 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" path="/var/lib/kubelet/pods/bfee86c1-486b-4e4e-ac0d-5b1025238dca/volumes" Sep 30 14:23:36 crc kubenswrapper[4763]: I0930 14:23:36.060052 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:23:36 crc kubenswrapper[4763]: I0930 14:23:36.060545 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:23:36 crc kubenswrapper[4763]: I0930 14:23:36.060591 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 14:23:36 crc kubenswrapper[4763]: I0930 14:23:36.061151 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:23:36 crc kubenswrapper[4763]: I0930 14:23:36.061203 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" 
containerName="machine-config-daemon" containerID="cri-o://9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" gracePeriod=600 Sep 30 14:23:36 crc kubenswrapper[4763]: E0930 14:23:36.196002 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:23:36 crc kubenswrapper[4763]: I0930 14:23:36.764249 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" exitCode=0 Sep 30 14:23:36 crc kubenswrapper[4763]: I0930 14:23:36.764310 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f"} Sep 30 14:23:36 crc kubenswrapper[4763]: I0930 14:23:36.764355 4763 scope.go:117] "RemoveContainer" containerID="a56544cbcc64bd3b357bedf67d6269d459c5e046c0ab64cd6996477ee180191a" Sep 30 14:23:36 crc kubenswrapper[4763]: I0930 14:23:36.764993 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:23:36 crc kubenswrapper[4763]: E0930 14:23:36.765469 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:23:48 crc kubenswrapper[4763]: I0930 14:23:48.493057 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:23:48 crc kubenswrapper[4763]: E0930 14:23:48.493890 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:24:03 crc kubenswrapper[4763]: I0930 14:24:03.488951 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:24:03 crc kubenswrapper[4763]: E0930 14:24:03.489715 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:24:17 crc kubenswrapper[4763]: I0930 14:24:17.490284 4763 scope.go:117] "RemoveContainer" 
containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:24:17 crc kubenswrapper[4763]: E0930 14:24:17.491091 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:24:29 crc kubenswrapper[4763]: I0930 14:24:29.489587 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:24:29 crc kubenswrapper[4763]: E0930 14:24:29.490295 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:24:43 crc kubenswrapper[4763]: I0930 14:24:43.489386 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:24:43 crc kubenswrapper[4763]: E0930 14:24:43.490195 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:24:56 crc kubenswrapper[4763]: I0930 14:24:56.489642 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:24:56 crc kubenswrapper[4763]: E0930 14:24:56.490324 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.141306 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-snkl4"] Sep 30 14:25:07 crc kubenswrapper[4763]: E0930 14:25:07.142186 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" containerName="registry-server" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.142201 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" containerName="registry-server" Sep 30 14:25:07 crc kubenswrapper[4763]: E0930 14:25:07.142224 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" containerName="extract-utilities" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.142234 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" 
containerName="extract-utilities" Sep 30 14:25:07 crc kubenswrapper[4763]: E0930 14:25:07.142267 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" containerName="extract-content" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.142274 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" containerName="extract-content" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.142433 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfee86c1-486b-4e4e-ac0d-5b1025238dca" containerName="registry-server" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.143758 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.165307 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-snkl4"] Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.315745 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-utilities\") pod \"certified-operators-snkl4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.315867 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-catalog-content\") pod \"certified-operators-snkl4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.315990 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7snjz\" (UniqueName: \"kubernetes.io/projected/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-kube-api-access-7snjz\") pod \"certified-operators-snkl4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.417404 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7snjz\" (UniqueName: \"kubernetes.io/projected/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-kube-api-access-7snjz\") pod \"certified-operators-snkl4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.417509 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-utilities\") pod \"certified-operators-snkl4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.417557 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-catalog-content\") pod \"certified-operators-snkl4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.418036 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-utilities\") pod \"certified-operators-snkl4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.418081 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-catalog-content\") pod \"certified-operators-snkl4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.442129 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7snjz\" (UniqueName: \"kubernetes.io/projected/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-kube-api-access-7snjz\") pod \"certified-operators-snkl4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.474847 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.489929 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:25:07 crc kubenswrapper[4763]: E0930 14:25:07.490111 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:25:07 crc kubenswrapper[4763]: I0930 14:25:07.990988 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-snkl4"] Sep 30 14:25:08 crc kubenswrapper[4763]: I0930 14:25:08.387411 4763 generic.go:334] "Generic (PLEG): container finished" podID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" containerID="4aa0d0bdf69927008569af6dc32499f3ce861d47cfbcebcd7c5e91a933718a22" exitCode=0 Sep 30 14:25:08 crc kubenswrapper[4763]: I0930 14:25:08.387490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snkl4" event={"ID":"e19e8e49-035e-4d7f-9b38-842a77fb6fe4","Type":"ContainerDied","Data":"4aa0d0bdf69927008569af6dc32499f3ce861d47cfbcebcd7c5e91a933718a22"} Sep 30 14:25:08 crc kubenswrapper[4763]: I0930 14:25:08.387927 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snkl4" event={"ID":"e19e8e49-035e-4d7f-9b38-842a77fb6fe4","Type":"ContainerStarted","Data":"7f7f715576479788949cbb2f725cc737043c13bc669dd5e249d024b240026fbb"} Sep 30 14:25:08 crc kubenswrapper[4763]: I0930 14:25:08.389774 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:25:09 crc kubenswrapper[4763]: I0930 14:25:09.400671 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snkl4" event={"ID":"e19e8e49-035e-4d7f-9b38-842a77fb6fe4","Type":"ContainerStarted","Data":"0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e"} Sep 30 14:25:10 crc kubenswrapper[4763]: I0930 14:25:10.415892 4763 
generic.go:334] "Generic (PLEG): container finished" podID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" containerID="0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e" exitCode=0 Sep 30 14:25:10 crc kubenswrapper[4763]: I0930 14:25:10.415958 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snkl4" event={"ID":"e19e8e49-035e-4d7f-9b38-842a77fb6fe4","Type":"ContainerDied","Data":"0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e"} Sep 30 14:25:11 crc kubenswrapper[4763]: I0930 14:25:11.426032 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snkl4" event={"ID":"e19e8e49-035e-4d7f-9b38-842a77fb6fe4","Type":"ContainerStarted","Data":"b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe"} Sep 30 14:25:11 crc kubenswrapper[4763]: I0930 14:25:11.446831 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-snkl4" podStartSLOduration=1.916487337 podStartE2EDuration="4.446809981s" podCreationTimestamp="2025-09-30 14:25:07 +0000 UTC" firstStartedPulling="2025-09-30 14:25:08.389347466 +0000 UTC m=+2980.527907751" lastFinishedPulling="2025-09-30 14:25:10.91967011 +0000 UTC m=+2983.058230395" observedRunningTime="2025-09-30 14:25:11.444439321 +0000 UTC m=+2983.582999616" watchObservedRunningTime="2025-09-30 14:25:11.446809981 +0000 UTC m=+2983.585370266" Sep 30 14:25:17 crc kubenswrapper[4763]: I0930 14:25:17.476012 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:17 crc kubenswrapper[4763]: I0930 14:25:17.476750 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:17 crc kubenswrapper[4763]: I0930 14:25:17.553505 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:18 crc kubenswrapper[4763]: I0930 14:25:18.530032 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:18 crc kubenswrapper[4763]: I0930 14:25:18.586268 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-snkl4"] Sep 30 14:25:20 crc kubenswrapper[4763]: I0930 14:25:20.490293 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:25:20 crc kubenswrapper[4763]: E0930 14:25:20.490548 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:25:20 crc kubenswrapper[4763]: I0930 14:25:20.498067 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-snkl4" podUID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" containerName="registry-server" containerID="cri-o://b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe" gracePeriod=2 Sep 30 14:25:20 crc kubenswrapper[4763]: I0930 14:25:20.869593 4763 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.014833 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7snjz\" (UniqueName: \"kubernetes.io/projected/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-kube-api-access-7snjz\") pod \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.016170 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-utilities\") pod \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.016228 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-catalog-content\") pod \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\" (UID: \"e19e8e49-035e-4d7f-9b38-842a77fb6fe4\") " Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.017755 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-utilities" (OuterVolumeSpecName: "utilities") pod "e19e8e49-035e-4d7f-9b38-842a77fb6fe4" (UID: "e19e8e49-035e-4d7f-9b38-842a77fb6fe4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.020801 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-kube-api-access-7snjz" (OuterVolumeSpecName: "kube-api-access-7snjz") pod "e19e8e49-035e-4d7f-9b38-842a77fb6fe4" (UID: "e19e8e49-035e-4d7f-9b38-842a77fb6fe4"). InnerVolumeSpecName "kube-api-access-7snjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.064062 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e19e8e49-035e-4d7f-9b38-842a77fb6fe4" (UID: "e19e8e49-035e-4d7f-9b38-842a77fb6fe4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.117916 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7snjz\" (UniqueName: \"kubernetes.io/projected/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-kube-api-access-7snjz\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.118181 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.118747 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e19e8e49-035e-4d7f-9b38-842a77fb6fe4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.508584 4763 generic.go:334] "Generic (PLEG): container finished" podID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" containerID="b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe" exitCode=0 Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.508667 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snkl4" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.509318 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snkl4" event={"ID":"e19e8e49-035e-4d7f-9b38-842a77fb6fe4","Type":"ContainerDied","Data":"b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe"} Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.509386 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snkl4" event={"ID":"e19e8e49-035e-4d7f-9b38-842a77fb6fe4","Type":"ContainerDied","Data":"7f7f715576479788949cbb2f725cc737043c13bc669dd5e249d024b240026fbb"} Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.509417 4763 scope.go:117] "RemoveContainer" containerID="b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.527280 4763 scope.go:117] "RemoveContainer" containerID="0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.542196 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-snkl4"] Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.547590 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-snkl4"] Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.568931 4763 scope.go:117] "RemoveContainer" containerID="4aa0d0bdf69927008569af6dc32499f3ce861d47cfbcebcd7c5e91a933718a22" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.584950 4763 scope.go:117] "RemoveContainer" containerID="b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe" Sep 30 14:25:21 crc kubenswrapper[4763]: E0930 14:25:21.585380 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe\": container with ID starting with b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe not found: ID does not exist" containerID="b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.585409 
4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe"} err="failed to get container status \"b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe\": rpc error: code = NotFound desc = could not find container \"b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe\": container with ID starting with b0397ca712a1858d0faaf5db0f6ab477d07953ae5cf017776111137789271ebe not found: ID does not exist" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.585436 4763 scope.go:117] "RemoveContainer" containerID="0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e" Sep 30 14:25:21 crc kubenswrapper[4763]: E0930 14:25:21.585928 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e\": container with ID starting with 0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e not found: ID does not exist" containerID="0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.585976 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e"} err="failed to get container status \"0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e\": rpc error: code = NotFound desc = could not find container \"0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e\": container with ID starting with 0e4c59b56f06ed8e2afb48665e80d87224bd89b2165844165b85c334853d306e not found: ID does not exist" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.586016 4763 scope.go:117] "RemoveContainer" containerID="4aa0d0bdf69927008569af6dc32499f3ce861d47cfbcebcd7c5e91a933718a22" Sep 30 14:25:21 crc kubenswrapper[4763]: E0930 14:25:21.586375 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aa0d0bdf69927008569af6dc32499f3ce861d47cfbcebcd7c5e91a933718a22\": container with ID starting with 4aa0d0bdf69927008569af6dc32499f3ce861d47cfbcebcd7c5e91a933718a22 not found: ID does not exist" containerID="4aa0d0bdf69927008569af6dc32499f3ce861d47cfbcebcd7c5e91a933718a22" Sep 30 14:25:21 crc kubenswrapper[4763]: I0930 14:25:21.586404 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aa0d0bdf69927008569af6dc32499f3ce861d47cfbcebcd7c5e91a933718a22"} err="failed to get container status \"4aa0d0bdf69927008569af6dc32499f3ce861d47cfbcebcd7c5e91a933718a22\": rpc error: code = NotFound desc = could not find container \"4aa0d0bdf69927008569af6dc32499f3ce861d47cfbcebcd7c5e91a933718a22\": container with ID starting with 4aa0d0bdf69927008569af6dc32499f3ce861d47cfbcebcd7c5e91a933718a22 not found: ID does not exist" Sep 30 14:25:22 crc kubenswrapper[4763]: I0930 14:25:22.501772 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" path="/var/lib/kubelet/pods/e19e8e49-035e-4d7f-9b38-842a77fb6fe4/volumes" Sep 30 14:25:32 crc kubenswrapper[4763]: I0930 14:25:32.489451 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:25:32 crc kubenswrapper[4763]: E0930 14:25:32.491151 4763 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:25:44 crc kubenswrapper[4763]: I0930 14:25:44.488998 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:25:44 crc kubenswrapper[4763]: E0930 14:25:44.489768 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:25:57 crc kubenswrapper[4763]: I0930 14:25:57.488937 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:25:57 crc kubenswrapper[4763]: E0930 14:25:57.489650 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:26:09 crc kubenswrapper[4763]: I0930 14:26:09.489485 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:26:09 crc kubenswrapper[4763]: E0930 14:26:09.490217 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:26:20 crc kubenswrapper[4763]: I0930 14:26:20.490457 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:26:20 crc kubenswrapper[4763]: E0930 14:26:20.491393 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:26:34 crc kubenswrapper[4763]: I0930 14:26:34.494260 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:26:34 crc kubenswrapper[4763]: E0930 14:26:34.495127 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:26:46 crc kubenswrapper[4763]: I0930 14:26:46.489750 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:26:46 crc kubenswrapper[4763]: E0930 14:26:46.490489 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:26:57 crc kubenswrapper[4763]: I0930 14:26:57.489520 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:26:57 crc kubenswrapper[4763]: E0930 14:26:57.492148 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:27:12 crc kubenswrapper[4763]: I0930 14:27:12.489959 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:27:12 crc kubenswrapper[4763]: E0930 14:27:12.492572 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:27:23 crc kubenswrapper[4763]: I0930 14:27:23.489358 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:27:23 crc kubenswrapper[4763]: E0930 14:27:23.490511 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:27:37 crc kubenswrapper[4763]: I0930 14:27:37.489300 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:27:37 crc kubenswrapper[4763]: E0930 14:27:37.491578 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" 
podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:27:50 crc kubenswrapper[4763]: I0930 14:27:50.489444 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:27:50 crc kubenswrapper[4763]: E0930 14:27:50.490396 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:28:04 crc kubenswrapper[4763]: I0930 14:28:04.489153 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:28:04 crc kubenswrapper[4763]: E0930 14:28:04.489789 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:28:15 crc kubenswrapper[4763]: I0930 14:28:15.490018 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:28:15 crc kubenswrapper[4763]: E0930 14:28:15.490767 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:28:28 crc kubenswrapper[4763]: I0930 14:28:28.492918 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:28:28 crc kubenswrapper[4763]: E0930 14:28:28.493661 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:28:43 crc kubenswrapper[4763]: I0930 14:28:43.489524 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:28:44 crc kubenswrapper[4763]: I0930 14:28:44.040058 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"97ba18998f1cc4e89bcb2eed150426d5971671415b0b392d0f0a8a0ee3ef6eb0"} Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.194260 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc"] Sep 30 14:30:00 crc kubenswrapper[4763]: E0930 14:30:00.196332 4763 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" containerName="extract-content" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.196350 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" containerName="extract-content" Sep 30 14:30:00 crc kubenswrapper[4763]: E0930 14:30:00.196386 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" containerName="registry-server" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.196400 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" containerName="registry-server" Sep 30 14:30:00 crc kubenswrapper[4763]: E0930 14:30:00.196416 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" containerName="extract-utilities" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.196426 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" containerName="extract-utilities" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.196564 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19e8e49-035e-4d7f-9b38-842a77fb6fe4" containerName="registry-server" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.197240 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.201998 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.205188 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.214965 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc"] Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.333223 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74fac9a6-d13e-48f9-a502-2026c9d71525-secret-volume\") pod \"collect-profiles-29320710-5zlpc\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.333338 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h45v8\" (UniqueName: \"kubernetes.io/projected/74fac9a6-d13e-48f9-a502-2026c9d71525-kube-api-access-h45v8\") pod \"collect-profiles-29320710-5zlpc\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.333420 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74fac9a6-d13e-48f9-a502-2026c9d71525-config-volume\") pod \"collect-profiles-29320710-5zlpc\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.434907 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74fac9a6-d13e-48f9-a502-2026c9d71525-config-volume\") pod \"collect-profiles-29320710-5zlpc\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.435131 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74fac9a6-d13e-48f9-a502-2026c9d71525-secret-volume\") pod \"collect-profiles-29320710-5zlpc\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.435237 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h45v8\" (UniqueName: \"kubernetes.io/projected/74fac9a6-d13e-48f9-a502-2026c9d71525-kube-api-access-h45v8\") pod \"collect-profiles-29320710-5zlpc\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.436740 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74fac9a6-d13e-48f9-a502-2026c9d71525-config-volume\") pod \"collect-profiles-29320710-5zlpc\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.447174 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74fac9a6-d13e-48f9-a502-2026c9d71525-secret-volume\") pod \"collect-profiles-29320710-5zlpc\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.456838 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h45v8\" (UniqueName: \"kubernetes.io/projected/74fac9a6-d13e-48f9-a502-2026c9d71525-kube-api-access-h45v8\") pod \"collect-profiles-29320710-5zlpc\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.520894 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:00 crc kubenswrapper[4763]: I0930 14:30:00.937208 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc"] Sep 30 14:30:01 crc kubenswrapper[4763]: I0930 14:30:01.612696 4763 generic.go:334] "Generic (PLEG): container finished" podID="74fac9a6-d13e-48f9-a502-2026c9d71525" containerID="53315db663bab298ee6ab16fa5c5ec759a8e31781865a029f942c8cf381b077b" exitCode=0 Sep 30 14:30:01 crc kubenswrapper[4763]: I0930 14:30:01.612800 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" event={"ID":"74fac9a6-d13e-48f9-a502-2026c9d71525","Type":"ContainerDied","Data":"53315db663bab298ee6ab16fa5c5ec759a8e31781865a029f942c8cf381b077b"} Sep 30 14:30:01 crc kubenswrapper[4763]: I0930 14:30:01.613040 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" event={"ID":"74fac9a6-d13e-48f9-a502-2026c9d71525","Type":"ContainerStarted","Data":"0399f7e7fd03c66b880ae25a2595ff43ffaed5678bf4d2cfd1ef6f2f62deb792"} Sep 30 14:30:02 crc kubenswrapper[4763]: I0930 14:30:02.877485 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:02 crc kubenswrapper[4763]: I0930 14:30:02.970081 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74fac9a6-d13e-48f9-a502-2026c9d71525-secret-volume\") pod \"74fac9a6-d13e-48f9-a502-2026c9d71525\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " Sep 30 14:30:02 crc kubenswrapper[4763]: I0930 14:30:02.970256 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74fac9a6-d13e-48f9-a502-2026c9d71525-config-volume\") pod \"74fac9a6-d13e-48f9-a502-2026c9d71525\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " Sep 30 14:30:02 crc kubenswrapper[4763]: I0930 14:30:02.970283 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h45v8\" (UniqueName: \"kubernetes.io/projected/74fac9a6-d13e-48f9-a502-2026c9d71525-kube-api-access-h45v8\") pod \"74fac9a6-d13e-48f9-a502-2026c9d71525\" (UID: \"74fac9a6-d13e-48f9-a502-2026c9d71525\") " Sep 30 14:30:02 crc kubenswrapper[4763]: I0930 14:30:02.972377 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74fac9a6-d13e-48f9-a502-2026c9d71525-config-volume" (OuterVolumeSpecName: "config-volume") pod "74fac9a6-d13e-48f9-a502-2026c9d71525" (UID: "74fac9a6-d13e-48f9-a502-2026c9d71525"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:30:02 crc kubenswrapper[4763]: I0930 14:30:02.977208 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74fac9a6-d13e-48f9-a502-2026c9d71525-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "74fac9a6-d13e-48f9-a502-2026c9d71525" (UID: "74fac9a6-d13e-48f9-a502-2026c9d71525"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:30:02 crc kubenswrapper[4763]: I0930 14:30:02.977259 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74fac9a6-d13e-48f9-a502-2026c9d71525-kube-api-access-h45v8" (OuterVolumeSpecName: "kube-api-access-h45v8") pod "74fac9a6-d13e-48f9-a502-2026c9d71525" (UID: "74fac9a6-d13e-48f9-a502-2026c9d71525"). InnerVolumeSpecName "kube-api-access-h45v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:30:03 crc kubenswrapper[4763]: I0930 14:30:03.071784 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74fac9a6-d13e-48f9-a502-2026c9d71525-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:03 crc kubenswrapper[4763]: I0930 14:30:03.071831 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h45v8\" (UniqueName: \"kubernetes.io/projected/74fac9a6-d13e-48f9-a502-2026c9d71525-kube-api-access-h45v8\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:03 crc kubenswrapper[4763]: I0930 14:30:03.071846 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74fac9a6-d13e-48f9-a502-2026c9d71525-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:03 crc kubenswrapper[4763]: I0930 14:30:03.629005 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" event={"ID":"74fac9a6-d13e-48f9-a502-2026c9d71525","Type":"ContainerDied","Data":"0399f7e7fd03c66b880ae25a2595ff43ffaed5678bf4d2cfd1ef6f2f62deb792"} Sep 30 14:30:03 crc kubenswrapper[4763]: I0930 14:30:03.629242 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0399f7e7fd03c66b880ae25a2595ff43ffaed5678bf4d2cfd1ef6f2f62deb792" Sep 30 14:30:03 crc kubenswrapper[4763]: I0930 14:30:03.629259 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc" Sep 30 14:30:03 crc kubenswrapper[4763]: I0930 14:30:03.945740 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75"] Sep 30 14:30:03 crc kubenswrapper[4763]: I0930 14:30:03.950953 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-cwm75"] Sep 30 14:30:04 crc kubenswrapper[4763]: I0930 14:30:04.502887 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ca1ac89-3e97-43f0-a8a1-4b9dd101887d" path="/var/lib/kubelet/pods/6ca1ac89-3e97-43f0-a8a1-4b9dd101887d/volumes" Sep 30 14:30:33 crc kubenswrapper[4763]: I0930 14:30:33.204701 4763 scope.go:117] "RemoveContainer" containerID="1b53fb21c0f15bb62c3da5c0ff83299537f6c2a687b868630fb8ca957e6df5ab" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.464272 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mxcx4"] Sep 30 14:30:41 crc kubenswrapper[4763]: E0930 14:30:41.465175 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74fac9a6-d13e-48f9-a502-2026c9d71525" containerName="collect-profiles" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.465193 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="74fac9a6-d13e-48f9-a502-2026c9d71525" containerName="collect-profiles" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.465365 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="74fac9a6-d13e-48f9-a502-2026c9d71525" containerName="collect-profiles" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.468293 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.475758 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxcx4"] Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.515565 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-catalog-content\") pod \"redhat-marketplace-mxcx4\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.517769 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-utilities\") pod \"redhat-marketplace-mxcx4\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.531989 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9gq\" (UniqueName: \"kubernetes.io/projected/cb10a6f5-a464-4111-9044-26fe14bcb4cd-kube-api-access-ch9gq\") pod \"redhat-marketplace-mxcx4\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.633911 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-utilities\") pod \"redhat-marketplace-mxcx4\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.633991 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch9gq\" (UniqueName: \"kubernetes.io/projected/cb10a6f5-a464-4111-9044-26fe14bcb4cd-kube-api-access-ch9gq\") pod \"redhat-marketplace-mxcx4\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.634068 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-catalog-content\") pod \"redhat-marketplace-mxcx4\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.634498 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-utilities\") pod \"redhat-marketplace-mxcx4\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.634526 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-catalog-content\") pod \"redhat-marketplace-mxcx4\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.655904 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ch9gq\" (UniqueName: \"kubernetes.io/projected/cb10a6f5-a464-4111-9044-26fe14bcb4cd-kube-api-access-ch9gq\") pod \"redhat-marketplace-mxcx4\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:41 crc kubenswrapper[4763]: I0930 14:30:41.793773 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:42 crc kubenswrapper[4763]: I0930 14:30:42.204851 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxcx4"] Sep 30 14:30:42 crc kubenswrapper[4763]: I0930 14:30:42.898767 4763 generic.go:334] "Generic (PLEG): container finished" podID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" containerID="37469f6ef6f27ee6b921d6118236c7b68e2b8510ac889f12e5ceec95fa6f892a" exitCode=0 Sep 30 14:30:42 crc kubenswrapper[4763]: I0930 14:30:42.898814 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxcx4" event={"ID":"cb10a6f5-a464-4111-9044-26fe14bcb4cd","Type":"ContainerDied","Data":"37469f6ef6f27ee6b921d6118236c7b68e2b8510ac889f12e5ceec95fa6f892a"} Sep 30 14:30:42 crc kubenswrapper[4763]: I0930 14:30:42.898842 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxcx4" event={"ID":"cb10a6f5-a464-4111-9044-26fe14bcb4cd","Type":"ContainerStarted","Data":"8c122596d1cf58d8a52580cd4ae16741049a522f3e44a8031696bd6720b36e5f"} Sep 30 14:30:42 crc kubenswrapper[4763]: I0930 14:30:42.901403 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:30:43 crc kubenswrapper[4763]: I0930 14:30:43.914981 4763 generic.go:334] "Generic (PLEG): container finished" podID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" containerID="eb3dc6b3a316faa42c0ef1cfc81bceb4b847211235941a3358f20130871ae253" exitCode=0 Sep 30 14:30:43 crc kubenswrapper[4763]: I0930 14:30:43.915049 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxcx4" event={"ID":"cb10a6f5-a464-4111-9044-26fe14bcb4cd","Type":"ContainerDied","Data":"eb3dc6b3a316faa42c0ef1cfc81bceb4b847211235941a3358f20130871ae253"} Sep 30 14:30:44 crc kubenswrapper[4763]: I0930 14:30:44.924342 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxcx4" event={"ID":"cb10a6f5-a464-4111-9044-26fe14bcb4cd","Type":"ContainerStarted","Data":"77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2"} Sep 30 14:30:44 crc kubenswrapper[4763]: I0930 14:30:44.939784 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mxcx4" podStartSLOduration=2.536857729 podStartE2EDuration="3.939752418s" podCreationTimestamp="2025-09-30 14:30:41 +0000 UTC" firstStartedPulling="2025-09-30 14:30:42.901152883 +0000 UTC m=+3315.039713168" lastFinishedPulling="2025-09-30 14:30:44.304047572 +0000 UTC m=+3316.442607857" observedRunningTime="2025-09-30 14:30:44.938328193 +0000 UTC m=+3317.076888498" watchObservedRunningTime="2025-09-30 14:30:44.939752418 +0000 UTC m=+3317.078312703" Sep 30 14:30:51 crc kubenswrapper[4763]: I0930 14:30:51.794770 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:51 crc kubenswrapper[4763]: I0930 14:30:51.795403 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:51 crc kubenswrapper[4763]: I0930 14:30:51.847320 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:52 crc kubenswrapper[4763]: I0930 14:30:52.018706 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:52 crc kubenswrapper[4763]: I0930 14:30:52.081298 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxcx4"] Sep 30 14:30:53 crc kubenswrapper[4763]: I0930 14:30:53.989121 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mxcx4" podUID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" containerName="registry-server" containerID="cri-o://77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2" gracePeriod=2 Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.385431 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.416759 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-utilities\") pod \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.416988 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-catalog-content\") pod \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.417012 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch9gq\" (UniqueName: \"kubernetes.io/projected/cb10a6f5-a464-4111-9044-26fe14bcb4cd-kube-api-access-ch9gq\") pod \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\" (UID: \"cb10a6f5-a464-4111-9044-26fe14bcb4cd\") " Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.417997 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-utilities" (OuterVolumeSpecName: "utilities") pod "cb10a6f5-a464-4111-9044-26fe14bcb4cd" (UID: "cb10a6f5-a464-4111-9044-26fe14bcb4cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.422788 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb10a6f5-a464-4111-9044-26fe14bcb4cd-kube-api-access-ch9gq" (OuterVolumeSpecName: "kube-api-access-ch9gq") pod "cb10a6f5-a464-4111-9044-26fe14bcb4cd" (UID: "cb10a6f5-a464-4111-9044-26fe14bcb4cd"). InnerVolumeSpecName "kube-api-access-ch9gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.432934 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb10a6f5-a464-4111-9044-26fe14bcb4cd" (UID: "cb10a6f5-a464-4111-9044-26fe14bcb4cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.521359 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.521412 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch9gq\" (UniqueName: \"kubernetes.io/projected/cb10a6f5-a464-4111-9044-26fe14bcb4cd-kube-api-access-ch9gq\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.521427 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb10a6f5-a464-4111-9044-26fe14bcb4cd-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.999671 4763 generic.go:334] "Generic (PLEG): container finished" podID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" containerID="77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2" exitCode=0 Sep 30 14:30:54 crc kubenswrapper[4763]: I0930 14:30:54.999727 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxcx4" event={"ID":"cb10a6f5-a464-4111-9044-26fe14bcb4cd","Type":"ContainerDied","Data":"77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2"} Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:54.999762 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxcx4" event={"ID":"cb10a6f5-a464-4111-9044-26fe14bcb4cd","Type":"ContainerDied","Data":"8c122596d1cf58d8a52580cd4ae16741049a522f3e44a8031696bd6720b36e5f"} Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:54.999798 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxcx4" Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:54.999796 4763 scope.go:117] "RemoveContainer" containerID="77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2" Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:55.025242 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxcx4"] Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:55.025548 4763 scope.go:117] "RemoveContainer" containerID="eb3dc6b3a316faa42c0ef1cfc81bceb4b847211235941a3358f20130871ae253" Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:55.038112 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxcx4"] Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:55.050325 4763 scope.go:117] "RemoveContainer" containerID="37469f6ef6f27ee6b921d6118236c7b68e2b8510ac889f12e5ceec95fa6f892a" Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:55.085079 4763 scope.go:117] "RemoveContainer" containerID="77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2" Sep 30 14:30:55 crc kubenswrapper[4763]: E0930 14:30:55.085620 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2\": container with ID starting with 77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2 not found: ID does not exist" containerID="77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2" Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:55.085658 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2"} err="failed to get container status \"77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2\": rpc error: code = NotFound desc = could not find container \"77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2\": container with ID starting with 77157eb9ac4e08d00f0286511762a8718833fe6dff2fb8cad2b88dba5505aab2 not found: ID does not exist" Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:55.085686 4763 scope.go:117] "RemoveContainer" containerID="eb3dc6b3a316faa42c0ef1cfc81bceb4b847211235941a3358f20130871ae253" Sep 30 14:30:55 crc kubenswrapper[4763]: E0930 14:30:55.085922 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb3dc6b3a316faa42c0ef1cfc81bceb4b847211235941a3358f20130871ae253\": container with ID starting with eb3dc6b3a316faa42c0ef1cfc81bceb4b847211235941a3358f20130871ae253 not found: ID does not exist" containerID="eb3dc6b3a316faa42c0ef1cfc81bceb4b847211235941a3358f20130871ae253" Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:55.085945 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb3dc6b3a316faa42c0ef1cfc81bceb4b847211235941a3358f20130871ae253"} err="failed to get container status \"eb3dc6b3a316faa42c0ef1cfc81bceb4b847211235941a3358f20130871ae253\": rpc error: code = NotFound desc = could not find container \"eb3dc6b3a316faa42c0ef1cfc81bceb4b847211235941a3358f20130871ae253\": container with ID starting with eb3dc6b3a316faa42c0ef1cfc81bceb4b847211235941a3358f20130871ae253 not found: ID does not exist" Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:55.085965 4763 scope.go:117] "RemoveContainer" 
containerID="37469f6ef6f27ee6b921d6118236c7b68e2b8510ac889f12e5ceec95fa6f892a" Sep 30 14:30:55 crc kubenswrapper[4763]: E0930 14:30:55.086170 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37469f6ef6f27ee6b921d6118236c7b68e2b8510ac889f12e5ceec95fa6f892a\": container with ID starting with 37469f6ef6f27ee6b921d6118236c7b68e2b8510ac889f12e5ceec95fa6f892a not found: ID does not exist" containerID="37469f6ef6f27ee6b921d6118236c7b68e2b8510ac889f12e5ceec95fa6f892a" Sep 30 14:30:55 crc kubenswrapper[4763]: I0930 14:30:55.086193 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37469f6ef6f27ee6b921d6118236c7b68e2b8510ac889f12e5ceec95fa6f892a"} err="failed to get container status \"37469f6ef6f27ee6b921d6118236c7b68e2b8510ac889f12e5ceec95fa6f892a\": rpc error: code = NotFound desc = could not find container \"37469f6ef6f27ee6b921d6118236c7b68e2b8510ac889f12e5ceec95fa6f892a\": container with ID starting with 37469f6ef6f27ee6b921d6118236c7b68e2b8510ac889f12e5ceec95fa6f892a not found: ID does not exist" Sep 30 14:30:56 crc kubenswrapper[4763]: I0930 14:30:56.499209 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" path="/var/lib/kubelet/pods/cb10a6f5-a464-4111-9044-26fe14bcb4cd/volumes" Sep 30 14:31:06 crc kubenswrapper[4763]: I0930 14:31:06.059772 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:31:06 crc kubenswrapper[4763]: I0930 14:31:06.060363 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:31:36 crc kubenswrapper[4763]: I0930 14:31:36.060322 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:31:36 crc kubenswrapper[4763]: I0930 14:31:36.060834 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:32:06 crc kubenswrapper[4763]: I0930 14:32:06.059682 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:32:06 crc kubenswrapper[4763]: I0930 14:32:06.060197 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:32:06 crc kubenswrapper[4763]: I0930 14:32:06.060253 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 14:32:06 crc kubenswrapper[4763]: I0930 14:32:06.060944 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97ba18998f1cc4e89bcb2eed150426d5971671415b0b392d0f0a8a0ee3ef6eb0"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:32:06 crc kubenswrapper[4763]: I0930 14:32:06.061013 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://97ba18998f1cc4e89bcb2eed150426d5971671415b0b392d0f0a8a0ee3ef6eb0" gracePeriod=600 Sep 30 14:32:06 crc kubenswrapper[4763]: I0930 14:32:06.515691 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="97ba18998f1cc4e89bcb2eed150426d5971671415b0b392d0f0a8a0ee3ef6eb0" exitCode=0 Sep 30 14:32:06 crc kubenswrapper[4763]: I0930 14:32:06.515825 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"97ba18998f1cc4e89bcb2eed150426d5971671415b0b392d0f0a8a0ee3ef6eb0"} Sep 30 14:32:06 crc kubenswrapper[4763]: I0930 14:32:06.516051 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"} Sep 30 14:32:06 crc kubenswrapper[4763]: I0930 14:32:06.516080 4763 scope.go:117] "RemoveContainer" containerID="9bc781351f1faffe7a3d3fdc71cc447636ec74c1567d394b336a2cd2ac3d222f" Sep 30 14:33:52 crc kubenswrapper[4763]: I0930 14:33:52.923353 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4mfkv"] Sep 30 14:33:52 crc kubenswrapper[4763]: E0930 14:33:52.924232 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" containerName="extract-utilities" Sep 30 14:33:52 crc kubenswrapper[4763]: I0930 14:33:52.924250 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" containerName="extract-utilities" Sep 30 14:33:52 crc kubenswrapper[4763]: E0930 14:33:52.924260 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" containerName="registry-server" Sep 30 14:33:52 crc kubenswrapper[4763]: I0930 14:33:52.924266 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" containerName="registry-server" Sep 30 14:33:52 crc kubenswrapper[4763]: E0930 14:33:52.924279 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" containerName="extract-content" Sep 30 14:33:52 crc kubenswrapper[4763]: I0930 14:33:52.924284 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" containerName="extract-content" Sep 30 14:33:52 crc kubenswrapper[4763]: I0930 14:33:52.924446 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb10a6f5-a464-4111-9044-26fe14bcb4cd" containerName="registry-server" Sep 30 14:33:52 crc kubenswrapper[4763]: I0930 14:33:52.925664 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:33:52 crc kubenswrapper[4763]: I0930 14:33:52.929339 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4mfkv"] Sep 30 14:33:52 crc kubenswrapper[4763]: I0930 14:33:52.966475 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-utilities\") pod \"redhat-operators-4mfkv\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:33:52 crc kubenswrapper[4763]: I0930 14:33:52.966829 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbjf5\" (UniqueName: \"kubernetes.io/projected/42e30c51-db28-4445-80a2-50a9cb578aba-kube-api-access-hbjf5\") pod \"redhat-operators-4mfkv\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:33:52 crc kubenswrapper[4763]: I0930 14:33:52.966992 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-catalog-content\") pod \"redhat-operators-4mfkv\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:33:53 crc kubenswrapper[4763]: I0930 14:33:53.067948 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-utilities\") pod \"redhat-operators-4mfkv\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:33:53 crc kubenswrapper[4763]: I0930 14:33:53.068030 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbjf5\" (UniqueName: \"kubernetes.io/projected/42e30c51-db28-4445-80a2-50a9cb578aba-kube-api-access-hbjf5\") pod \"redhat-operators-4mfkv\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:33:53 crc kubenswrapper[4763]: I0930 14:33:53.068079 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-catalog-content\") pod \"redhat-operators-4mfkv\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:33:53 crc kubenswrapper[4763]: I0930 14:33:53.068477 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-utilities\") pod \"redhat-operators-4mfkv\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:33:53 crc kubenswrapper[4763]: I0930 14:33:53.068526 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-catalog-content\") pod \"redhat-operators-4mfkv\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:33:53 crc kubenswrapper[4763]: I0930 14:33:53.089949 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbjf5\" (UniqueName: \"kubernetes.io/projected/42e30c51-db28-4445-80a2-50a9cb578aba-kube-api-access-hbjf5\") pod \"redhat-operators-4mfkv\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:33:53 crc kubenswrapper[4763]: I0930 14:33:53.248534 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:33:53 crc kubenswrapper[4763]: I0930 14:33:53.482675 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4mfkv"] Sep 30 14:33:54 crc kubenswrapper[4763]: I0930 14:33:54.281788 4763 generic.go:334] "Generic (PLEG): container finished" podID="42e30c51-db28-4445-80a2-50a9cb578aba" containerID="b9c40a0c94f01b80f954c5e2d4d9e146f015e3a541b5faf6a8e54d39775535b7" exitCode=0 Sep 30 14:33:54 crc kubenswrapper[4763]: I0930 14:33:54.281841 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mfkv" event={"ID":"42e30c51-db28-4445-80a2-50a9cb578aba","Type":"ContainerDied","Data":"b9c40a0c94f01b80f954c5e2d4d9e146f015e3a541b5faf6a8e54d39775535b7"} Sep 30 14:33:54 crc kubenswrapper[4763]: I0930 14:33:54.282372 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mfkv" event={"ID":"42e30c51-db28-4445-80a2-50a9cb578aba","Type":"ContainerStarted","Data":"996805aa68229f6484df5157f1c95b46eb939d94e0bd6925db08aec0f69b882d"} Sep 30 14:33:55 crc kubenswrapper[4763]: I0930 14:33:55.292789 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mfkv" event={"ID":"42e30c51-db28-4445-80a2-50a9cb578aba","Type":"ContainerStarted","Data":"69cb6d2a189419e4c917e49fdca25204a970e00ee7b505c5154d5a7cc81e5de9"} Sep 30 14:33:56 crc kubenswrapper[4763]: I0930 14:33:56.303417 4763 generic.go:334] "Generic (PLEG): container finished" podID="42e30c51-db28-4445-80a2-50a9cb578aba" containerID="69cb6d2a189419e4c917e49fdca25204a970e00ee7b505c5154d5a7cc81e5de9" exitCode=0 Sep 30 14:33:56 crc kubenswrapper[4763]: I0930 14:33:56.303467 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mfkv" event={"ID":"42e30c51-db28-4445-80a2-50a9cb578aba","Type":"ContainerDied","Data":"69cb6d2a189419e4c917e49fdca25204a970e00ee7b505c5154d5a7cc81e5de9"} Sep 30 14:33:57 crc kubenswrapper[4763]: I0930 14:33:57.313535 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mfkv" event={"ID":"42e30c51-db28-4445-80a2-50a9cb578aba","Type":"ContainerStarted","Data":"833bb2d3ff28e6cd3ecf5313ba7cf739cfc6367ec41e99044379cb4702ff11a8"} Sep 30 14:33:57 crc kubenswrapper[4763]: I0930 14:33:57.357507 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4mfkv" podStartSLOduration=2.886068126 podStartE2EDuration="5.357484613s" podCreationTimestamp="2025-09-30 14:33:52 +0000 UTC" firstStartedPulling="2025-09-30 14:33:54.284033539 +0000 UTC m=+3506.422593824" lastFinishedPulling="2025-09-30 14:33:56.755450026 
+0000 UTC m=+3508.894010311" observedRunningTime="2025-09-30 14:33:57.350797098 +0000 UTC m=+3509.489357393" watchObservedRunningTime="2025-09-30 14:33:57.357484613 +0000 UTC m=+3509.496044898" Sep 30 14:34:03 crc kubenswrapper[4763]: I0930 14:34:03.249672 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:34:03 crc kubenswrapper[4763]: I0930 14:34:03.250203 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:34:03 crc kubenswrapper[4763]: I0930 14:34:03.290511 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:34:03 crc kubenswrapper[4763]: I0930 14:34:03.399185 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:34:03 crc kubenswrapper[4763]: I0930 14:34:03.522212 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4mfkv"] Sep 30 14:34:05 crc kubenswrapper[4763]: I0930 14:34:05.372714 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4mfkv" podUID="42e30c51-db28-4445-80a2-50a9cb578aba" containerName="registry-server" containerID="cri-o://833bb2d3ff28e6cd3ecf5313ba7cf739cfc6367ec41e99044379cb4702ff11a8" gracePeriod=2 Sep 30 14:34:06 crc kubenswrapper[4763]: I0930 14:34:06.059554 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:34:06 crc kubenswrapper[4763]: I0930 14:34:06.059638 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:34:06 crc kubenswrapper[4763]: I0930 14:34:06.380739 4763 generic.go:334] "Generic (PLEG): container finished" podID="42e30c51-db28-4445-80a2-50a9cb578aba" containerID="833bb2d3ff28e6cd3ecf5313ba7cf739cfc6367ec41e99044379cb4702ff11a8" exitCode=0 Sep 30 14:34:06 crc kubenswrapper[4763]: I0930 14:34:06.380778 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mfkv" event={"ID":"42e30c51-db28-4445-80a2-50a9cb578aba","Type":"ContainerDied","Data":"833bb2d3ff28e6cd3ecf5313ba7cf739cfc6367ec41e99044379cb4702ff11a8"} Sep 30 14:34:06 crc kubenswrapper[4763]: I0930 14:34:06.830985 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:34:06 crc kubenswrapper[4763]: I0930 14:34:06.962514 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-catalog-content\") pod \"42e30c51-db28-4445-80a2-50a9cb578aba\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " Sep 30 14:34:06 crc kubenswrapper[4763]: I0930 14:34:06.962612 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-utilities\") pod \"42e30c51-db28-4445-80a2-50a9cb578aba\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " Sep 30 14:34:06 crc kubenswrapper[4763]: I0930 14:34:06.962700 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbjf5\" (UniqueName: \"kubernetes.io/projected/42e30c51-db28-4445-80a2-50a9cb578aba-kube-api-access-hbjf5\") pod \"42e30c51-db28-4445-80a2-50a9cb578aba\" (UID: \"42e30c51-db28-4445-80a2-50a9cb578aba\") " Sep 30 14:34:06 crc kubenswrapper[4763]: I0930 14:34:06.963492 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-utilities" (OuterVolumeSpecName: "utilities") pod "42e30c51-db28-4445-80a2-50a9cb578aba" (UID: "42e30c51-db28-4445-80a2-50a9cb578aba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4763]: I0930 14:34:06.970894 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e30c51-db28-4445-80a2-50a9cb578aba-kube-api-access-hbjf5" (OuterVolumeSpecName: "kube-api-access-hbjf5") pod "42e30c51-db28-4445-80a2-50a9cb578aba" (UID: "42e30c51-db28-4445-80a2-50a9cb578aba"). InnerVolumeSpecName "kube-api-access-hbjf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:07 crc kubenswrapper[4763]: I0930 14:34:07.055062 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42e30c51-db28-4445-80a2-50a9cb578aba" (UID: "42e30c51-db28-4445-80a2-50a9cb578aba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:07 crc kubenswrapper[4763]: I0930 14:34:07.064314 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:07 crc kubenswrapper[4763]: I0930 14:34:07.064481 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e30c51-db28-4445-80a2-50a9cb578aba-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:07 crc kubenswrapper[4763]: I0930 14:34:07.064541 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbjf5\" (UniqueName: \"kubernetes.io/projected/42e30c51-db28-4445-80a2-50a9cb578aba-kube-api-access-hbjf5\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:07 crc kubenswrapper[4763]: I0930 14:34:07.390170 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mfkv" event={"ID":"42e30c51-db28-4445-80a2-50a9cb578aba","Type":"ContainerDied","Data":"996805aa68229f6484df5157f1c95b46eb939d94e0bd6925db08aec0f69b882d"} Sep 30 14:34:07 crc kubenswrapper[4763]: I0930 14:34:07.390240 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4mfkv" Sep 30 14:34:07 crc kubenswrapper[4763]: I0930 14:34:07.390289 4763 scope.go:117] "RemoveContainer" containerID="833bb2d3ff28e6cd3ecf5313ba7cf739cfc6367ec41e99044379cb4702ff11a8" Sep 30 14:34:07 crc kubenswrapper[4763]: I0930 14:34:07.414724 4763 scope.go:117] "RemoveContainer" containerID="69cb6d2a189419e4c917e49fdca25204a970e00ee7b505c5154d5a7cc81e5de9" Sep 30 14:34:07 crc kubenswrapper[4763]: I0930 14:34:07.416747 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4mfkv"] Sep 30 14:34:07 crc kubenswrapper[4763]: I0930 14:34:07.421830 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4mfkv"] Sep 30 14:34:07 crc kubenswrapper[4763]: I0930 14:34:07.453018 4763 scope.go:117] "RemoveContainer" containerID="b9c40a0c94f01b80f954c5e2d4d9e146f015e3a541b5faf6a8e54d39775535b7" Sep 30 14:34:08 crc kubenswrapper[4763]: I0930 14:34:08.498261 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e30c51-db28-4445-80a2-50a9cb578aba" path="/var/lib/kubelet/pods/42e30c51-db28-4445-80a2-50a9cb578aba/volumes" Sep 30 14:34:36 crc kubenswrapper[4763]: I0930 14:34:36.059672 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:34:36 crc kubenswrapper[4763]: I0930 14:34:36.060180 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:35:06 crc kubenswrapper[4763]: I0930 14:35:06.059801 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:35:06 crc kubenswrapper[4763]: I0930 14:35:06.060227 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:35:06 crc kubenswrapper[4763]: I0930 14:35:06.060269 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 14:35:06 crc kubenswrapper[4763]: I0930 14:35:06.060934 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:35:06 crc kubenswrapper[4763]: I0930 14:35:06.060982 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c" gracePeriod=600 Sep 30 14:35:06 crc kubenswrapper[4763]: E0930 14:35:06.190549 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:35:06 crc kubenswrapper[4763]: I0930 14:35:06.813003 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c" exitCode=0 Sep 30 14:35:06 crc kubenswrapper[4763]: I0930 14:35:06.813041 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"} Sep 30 14:35:06 crc kubenswrapper[4763]: I0930 14:35:06.813074 4763 scope.go:117] "RemoveContainer" containerID="97ba18998f1cc4e89bcb2eed150426d5971671415b0b392d0f0a8a0ee3ef6eb0" Sep 30 14:35:06 crc kubenswrapper[4763]: I0930 14:35:06.813629 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c" Sep 30 14:35:06 crc kubenswrapper[4763]: E0930 14:35:06.813916 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:35:20 crc kubenswrapper[4763]: I0930 14:35:20.489727 4763 scope.go:117] "RemoveContainer" 
containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c" Sep 30 14:35:20 crc kubenswrapper[4763]: E0930 14:35:20.493100 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.060343 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dlbn7"] Sep 30 14:35:25 crc kubenswrapper[4763]: E0930 14:35:25.060914 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e30c51-db28-4445-80a2-50a9cb578aba" containerName="registry-server" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.060926 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e30c51-db28-4445-80a2-50a9cb578aba" containerName="registry-server" Sep 30 14:35:25 crc kubenswrapper[4763]: E0930 14:35:25.060938 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e30c51-db28-4445-80a2-50a9cb578aba" containerName="extract-utilities" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.060944 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e30c51-db28-4445-80a2-50a9cb578aba" containerName="extract-utilities" Sep 30 14:35:25 crc kubenswrapper[4763]: E0930 14:35:25.060960 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e30c51-db28-4445-80a2-50a9cb578aba" containerName="extract-content" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.060965 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e30c51-db28-4445-80a2-50a9cb578aba" containerName="extract-content" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.061106 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e30c51-db28-4445-80a2-50a9cb578aba" containerName="registry-server" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.062098 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.064533 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlbn7"] Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.150205 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22k8c\" (UniqueName: \"kubernetes.io/projected/f46f0210-0293-4404-ba57-b605e0f53fb2-kube-api-access-22k8c\") pod \"certified-operators-dlbn7\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.150468 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-utilities\") pod \"certified-operators-dlbn7\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.150658 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-catalog-content\") pod \"certified-operators-dlbn7\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.251961 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-catalog-content\") pod \"certified-operators-dlbn7\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.252087 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22k8c\" (UniqueName: \"kubernetes.io/projected/f46f0210-0293-4404-ba57-b605e0f53fb2-kube-api-access-22k8c\") pod \"certified-operators-dlbn7\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.252109 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-utilities\") pod \"certified-operators-dlbn7\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.252578 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-catalog-content\") pod \"certified-operators-dlbn7\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.252798 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-utilities\") pod \"certified-operators-dlbn7\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.274510 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-22k8c\" (UniqueName: \"kubernetes.io/projected/f46f0210-0293-4404-ba57-b605e0f53fb2-kube-api-access-22k8c\") pod \"certified-operators-dlbn7\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.421653 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.734465 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlbn7"] Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.966400 4763 generic.go:334] "Generic (PLEG): container finished" podID="f46f0210-0293-4404-ba57-b605e0f53fb2" containerID="07529c9e03f4e3de3487a69284e733c71b12d4fce5169100795f0865bd9cbe88" exitCode=0 Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.966453 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlbn7" event={"ID":"f46f0210-0293-4404-ba57-b605e0f53fb2","Type":"ContainerDied","Data":"07529c9e03f4e3de3487a69284e733c71b12d4fce5169100795f0865bd9cbe88"} Sep 30 14:35:25 crc kubenswrapper[4763]: I0930 14:35:25.966485 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlbn7" event={"ID":"f46f0210-0293-4404-ba57-b605e0f53fb2","Type":"ContainerStarted","Data":"c2b7a5162123d31c5bb0a7ca5f455f8952cc9ead45a7e476637733982f13a880"} Sep 30 14:35:27 crc kubenswrapper[4763]: I0930 14:35:27.983383 4763 generic.go:334] "Generic (PLEG): container finished" podID="f46f0210-0293-4404-ba57-b605e0f53fb2" containerID="48d9450caef80733646ba48ae1cc52cec848bcfab06cd55036eb5be37991a93c" exitCode=0 Sep 30 14:35:27 crc kubenswrapper[4763]: I0930 14:35:27.983457 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlbn7" event={"ID":"f46f0210-0293-4404-ba57-b605e0f53fb2","Type":"ContainerDied","Data":"48d9450caef80733646ba48ae1cc52cec848bcfab06cd55036eb5be37991a93c"} Sep 30 14:35:28 crc kubenswrapper[4763]: I0930 14:35:28.998425 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlbn7" event={"ID":"f46f0210-0293-4404-ba57-b605e0f53fb2","Type":"ContainerStarted","Data":"05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349"} Sep 30 14:35:29 crc kubenswrapper[4763]: I0930 14:35:29.021657 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dlbn7" podStartSLOduration=1.5694645440000001 podStartE2EDuration="4.021626696s" podCreationTimestamp="2025-09-30 14:35:25 +0000 UTC" firstStartedPulling="2025-09-30 14:35:25.968188912 +0000 UTC m=+3598.106749197" lastFinishedPulling="2025-09-30 14:35:28.420351054 +0000 UTC m=+3600.558911349" observedRunningTime="2025-09-30 14:35:29.020669681 +0000 UTC m=+3601.159229966" watchObservedRunningTime="2025-09-30 14:35:29.021626696 +0000 UTC m=+3601.160186971" Sep 30 14:35:32 crc kubenswrapper[4763]: I0930 14:35:32.489157 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c" Sep 30 14:35:32 crc kubenswrapper[4763]: E0930 14:35:32.489644 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:35:35 crc kubenswrapper[4763]: I0930 14:35:35.422089 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:35 crc kubenswrapper[4763]: I0930 14:35:35.422445 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:35 crc kubenswrapper[4763]: I0930 14:35:35.465359 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:36 crc kubenswrapper[4763]: I0930 14:35:36.100056 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:36 crc kubenswrapper[4763]: I0930 14:35:36.148821 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlbn7"] Sep 30 14:35:38 crc kubenswrapper[4763]: I0930 14:35:38.073867 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dlbn7" podUID="f46f0210-0293-4404-ba57-b605e0f53fb2" containerName="registry-server" containerID="cri-o://05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349" gracePeriod=2 Sep 30 14:35:38 crc kubenswrapper[4763]: I0930 14:35:38.484052 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:38 crc kubenswrapper[4763]: I0930 14:35:38.544178 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-utilities\") pod \"f46f0210-0293-4404-ba57-b605e0f53fb2\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " Sep 30 14:35:38 crc kubenswrapper[4763]: I0930 14:35:38.544236 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22k8c\" (UniqueName: \"kubernetes.io/projected/f46f0210-0293-4404-ba57-b605e0f53fb2-kube-api-access-22k8c\") pod \"f46f0210-0293-4404-ba57-b605e0f53fb2\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " Sep 30 14:35:38 crc kubenswrapper[4763]: I0930 14:35:38.544280 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-catalog-content\") pod \"f46f0210-0293-4404-ba57-b605e0f53fb2\" (UID: \"f46f0210-0293-4404-ba57-b605e0f53fb2\") " Sep 30 14:35:38 crc kubenswrapper[4763]: I0930 14:35:38.549445 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46f0210-0293-4404-ba57-b605e0f53fb2-kube-api-access-22k8c" (OuterVolumeSpecName: "kube-api-access-22k8c") pod "f46f0210-0293-4404-ba57-b605e0f53fb2" (UID: "f46f0210-0293-4404-ba57-b605e0f53fb2"). InnerVolumeSpecName "kube-api-access-22k8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:38 crc kubenswrapper[4763]: I0930 14:35:38.556245 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-utilities" (OuterVolumeSpecName: "utilities") pod "f46f0210-0293-4404-ba57-b605e0f53fb2" (UID: "f46f0210-0293-4404-ba57-b605e0f53fb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:35:38 crc kubenswrapper[4763]: I0930 14:35:38.590609 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f46f0210-0293-4404-ba57-b605e0f53fb2" (UID: "f46f0210-0293-4404-ba57-b605e0f53fb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:35:38 crc kubenswrapper[4763]: I0930 14:35:38.645746 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:38 crc kubenswrapper[4763]: I0930 14:35:38.645795 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22k8c\" (UniqueName: \"kubernetes.io/projected/f46f0210-0293-4404-ba57-b605e0f53fb2-kube-api-access-22k8c\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:38 crc kubenswrapper[4763]: I0930 14:35:38.645811 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46f0210-0293-4404-ba57-b605e0f53fb2-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.083341 4763 generic.go:334] "Generic (PLEG): container finished" podID="f46f0210-0293-4404-ba57-b605e0f53fb2" containerID="05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349" exitCode=0 Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.083387 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlbn7" event={"ID":"f46f0210-0293-4404-ba57-b605e0f53fb2","Type":"ContainerDied","Data":"05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349"} Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.083414 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlbn7" event={"ID":"f46f0210-0293-4404-ba57-b605e0f53fb2","Type":"ContainerDied","Data":"c2b7a5162123d31c5bb0a7ca5f455f8952cc9ead45a7e476637733982f13a880"} Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.083432 4763 scope.go:117] "RemoveContainer" containerID="05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349" Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.083550 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlbn7" Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.106518 4763 scope.go:117] "RemoveContainer" containerID="48d9450caef80733646ba48ae1cc52cec848bcfab06cd55036eb5be37991a93c" Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.118993 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlbn7"] Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.125334 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dlbn7"] Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.139850 4763 scope.go:117] "RemoveContainer" containerID="07529c9e03f4e3de3487a69284e733c71b12d4fce5169100795f0865bd9cbe88" Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.167837 4763 scope.go:117] "RemoveContainer" containerID="05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349" Sep 30 14:35:39 crc kubenswrapper[4763]: E0930 14:35:39.168312 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349\": container with ID starting with 05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349 not found: ID does not exist" containerID="05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349" Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.168361 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349"} err="failed to get container status \"05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349\": rpc error: code = NotFound desc = could not find container \"05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349\": container with ID starting with 05cf02b73dca6a556cd0a111dd1dd333fa6acdd8b3f1aa9b7a744985efef1349 not found: ID does not exist" Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.168393 4763 scope.go:117] "RemoveContainer" containerID="48d9450caef80733646ba48ae1cc52cec848bcfab06cd55036eb5be37991a93c" Sep 30 14:35:39 crc kubenswrapper[4763]: E0930 14:35:39.168838 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48d9450caef80733646ba48ae1cc52cec848bcfab06cd55036eb5be37991a93c\": container with ID starting with 48d9450caef80733646ba48ae1cc52cec848bcfab06cd55036eb5be37991a93c not found: ID does not exist" containerID="48d9450caef80733646ba48ae1cc52cec848bcfab06cd55036eb5be37991a93c" Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.168861 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d9450caef80733646ba48ae1cc52cec848bcfab06cd55036eb5be37991a93c"} err="failed to get container status \"48d9450caef80733646ba48ae1cc52cec848bcfab06cd55036eb5be37991a93c\": rpc error: code = NotFound desc = could not find container \"48d9450caef80733646ba48ae1cc52cec848bcfab06cd55036eb5be37991a93c\": container with ID starting with 48d9450caef80733646ba48ae1cc52cec848bcfab06cd55036eb5be37991a93c not found: ID does not exist" Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.168879 4763 scope.go:117] "RemoveContainer" containerID="07529c9e03f4e3de3487a69284e733c71b12d4fce5169100795f0865bd9cbe88" Sep 30 14:35:39 crc kubenswrapper[4763]: E0930 14:35:39.169227 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"07529c9e03f4e3de3487a69284e733c71b12d4fce5169100795f0865bd9cbe88\": container with ID starting with 07529c9e03f4e3de3487a69284e733c71b12d4fce5169100795f0865bd9cbe88 not found: ID does not exist" containerID="07529c9e03f4e3de3487a69284e733c71b12d4fce5169100795f0865bd9cbe88" Sep 30 14:35:39 crc kubenswrapper[4763]: I0930 14:35:39.169355 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07529c9e03f4e3de3487a69284e733c71b12d4fce5169100795f0865bd9cbe88"} err="failed to get container status \"07529c9e03f4e3de3487a69284e733c71b12d4fce5169100795f0865bd9cbe88\": rpc error: code = NotFound desc = could not find container \"07529c9e03f4e3de3487a69284e733c71b12d4fce5169100795f0865bd9cbe88\": container with ID starting with 07529c9e03f4e3de3487a69284e733c71b12d4fce5169100795f0865bd9cbe88 not found: ID does not exist" Sep 30 14:35:40 crc kubenswrapper[4763]: I0930 14:35:40.498418 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46f0210-0293-4404-ba57-b605e0f53fb2" path="/var/lib/kubelet/pods/f46f0210-0293-4404-ba57-b605e0f53fb2/volumes" Sep 30 14:35:43 crc kubenswrapper[4763]: I0930 14:35:43.489862 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c" Sep 30 14:35:43 crc kubenswrapper[4763]: E0930 14:35:43.490116 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.203528 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qlzqc"] Sep 30 14:35:49 crc kubenswrapper[4763]: E0930 14:35:49.204216 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46f0210-0293-4404-ba57-b605e0f53fb2" containerName="extract-content" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.204235 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46f0210-0293-4404-ba57-b605e0f53fb2" containerName="extract-content" Sep 30 14:35:49 crc kubenswrapper[4763]: E0930 14:35:49.204250 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46f0210-0293-4404-ba57-b605e0f53fb2" containerName="registry-server" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.204258 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46f0210-0293-4404-ba57-b605e0f53fb2" containerName="registry-server" Sep 30 14:35:49 crc kubenswrapper[4763]: E0930 14:35:49.204282 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46f0210-0293-4404-ba57-b605e0f53fb2" containerName="extract-utilities" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.204292 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46f0210-0293-4404-ba57-b605e0f53fb2" containerName="extract-utilities" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.204464 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46f0210-0293-4404-ba57-b605e0f53fb2" containerName="registry-server" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.206609 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.209120 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qlzqc"] Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.316279 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-catalog-content\") pod \"community-operators-qlzqc\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.316386 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-utilities\") pod \"community-operators-qlzqc\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.316439 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml4hg\" (UniqueName: \"kubernetes.io/projected/601c7f4e-1398-426d-a180-0c7d85173efa-kube-api-access-ml4hg\") pod \"community-operators-qlzqc\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.418695 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml4hg\" (UniqueName: \"kubernetes.io/projected/601c7f4e-1398-426d-a180-0c7d85173efa-kube-api-access-ml4hg\") pod \"community-operators-qlzqc\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.418788 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-catalog-content\") pod \"community-operators-qlzqc\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.418843 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-utilities\") pod \"community-operators-qlzqc\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.419373 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-utilities\") pod \"community-operators-qlzqc\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.419541 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-catalog-content\") pod \"community-operators-qlzqc\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.440414 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ml4hg\" (UniqueName: \"kubernetes.io/projected/601c7f4e-1398-426d-a180-0c7d85173efa-kube-api-access-ml4hg\") pod \"community-operators-qlzqc\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.532586 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:49 crc kubenswrapper[4763]: I0930 14:35:49.809410 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qlzqc"] Sep 30 14:35:50 crc kubenswrapper[4763]: I0930 14:35:50.171209 4763 generic.go:334] "Generic (PLEG): container finished" podID="601c7f4e-1398-426d-a180-0c7d85173efa" containerID="8d94463b3fbeefc81089baaac58e6067cff75fd76ad951ddeb5edc9eba54825c" exitCode=0 Sep 30 14:35:50 crc kubenswrapper[4763]: I0930 14:35:50.171367 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlzqc" event={"ID":"601c7f4e-1398-426d-a180-0c7d85173efa","Type":"ContainerDied","Data":"8d94463b3fbeefc81089baaac58e6067cff75fd76ad951ddeb5edc9eba54825c"} Sep 30 14:35:50 crc kubenswrapper[4763]: I0930 14:35:50.171535 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlzqc" event={"ID":"601c7f4e-1398-426d-a180-0c7d85173efa","Type":"ContainerStarted","Data":"bfcf0f4f4a48aeee0dc8d9c03f9287c9de9dc6c1ddd9d9c2e03623e1c447da0d"} Sep 30 14:35:50 crc kubenswrapper[4763]: I0930 14:35:50.174024 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:35:51 crc kubenswrapper[4763]: I0930 14:35:51.184691 4763 generic.go:334] "Generic (PLEG): container finished" podID="601c7f4e-1398-426d-a180-0c7d85173efa" containerID="5b166af20807cae2e28a4ffcec09d48b8354212a27dd33399661ee7c9ab7b82c" exitCode=0 Sep 30 14:35:51 crc kubenswrapper[4763]: I0930 14:35:51.184811 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlzqc" event={"ID":"601c7f4e-1398-426d-a180-0c7d85173efa","Type":"ContainerDied","Data":"5b166af20807cae2e28a4ffcec09d48b8354212a27dd33399661ee7c9ab7b82c"} Sep 30 14:35:52 crc kubenswrapper[4763]: I0930 14:35:52.194229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlzqc" event={"ID":"601c7f4e-1398-426d-a180-0c7d85173efa","Type":"ContainerStarted","Data":"0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2"} Sep 30 14:35:52 crc kubenswrapper[4763]: I0930 14:35:52.217556 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qlzqc" podStartSLOduration=1.609846198 podStartE2EDuration="3.217536907s" podCreationTimestamp="2025-09-30 14:35:49 +0000 UTC" firstStartedPulling="2025-09-30 14:35:50.173655658 +0000 UTC m=+3622.312215953" lastFinishedPulling="2025-09-30 14:35:51.781346337 +0000 UTC m=+3623.919906662" observedRunningTime="2025-09-30 14:35:52.212122701 +0000 UTC m=+3624.350683006" watchObservedRunningTime="2025-09-30 14:35:52.217536907 +0000 UTC m=+3624.356097182" Sep 30 14:35:55 crc kubenswrapper[4763]: I0930 14:35:55.490052 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c" Sep 30 14:35:55 crc kubenswrapper[4763]: E0930 14:35:55.490786 4763 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:35:59 crc kubenswrapper[4763]: I0930 14:35:59.533808 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:59 crc kubenswrapper[4763]: I0930 14:35:59.535479 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:35:59 crc kubenswrapper[4763]: I0930 14:35:59.579980 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:36:00 crc kubenswrapper[4763]: I0930 14:36:00.299849 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:36:01 crc kubenswrapper[4763]: I0930 14:36:01.189453 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qlzqc"] Sep 30 14:36:02 crc kubenswrapper[4763]: I0930 14:36:02.270145 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qlzqc" podUID="601c7f4e-1398-426d-a180-0c7d85173efa" containerName="registry-server" containerID="cri-o://0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2" gracePeriod=2 Sep 30 14:36:02 crc kubenswrapper[4763]: I0930 14:36:02.651497 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qlzqc" Sep 30 14:36:02 crc kubenswrapper[4763]: I0930 14:36:02.820136 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml4hg\" (UniqueName: \"kubernetes.io/projected/601c7f4e-1398-426d-a180-0c7d85173efa-kube-api-access-ml4hg\") pod \"601c7f4e-1398-426d-a180-0c7d85173efa\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " Sep 30 14:36:02 crc kubenswrapper[4763]: I0930 14:36:02.820242 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-catalog-content\") pod \"601c7f4e-1398-426d-a180-0c7d85173efa\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " Sep 30 14:36:02 crc kubenswrapper[4763]: I0930 14:36:02.820292 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-utilities\") pod \"601c7f4e-1398-426d-a180-0c7d85173efa\" (UID: \"601c7f4e-1398-426d-a180-0c7d85173efa\") " Sep 30 14:36:02 crc kubenswrapper[4763]: I0930 14:36:02.821245 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-utilities" (OuterVolumeSpecName: "utilities") pod "601c7f4e-1398-426d-a180-0c7d85173efa" (UID: "601c7f4e-1398-426d-a180-0c7d85173efa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:36:02 crc kubenswrapper[4763]: I0930 14:36:02.825761 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/601c7f4e-1398-426d-a180-0c7d85173efa-kube-api-access-ml4hg" (OuterVolumeSpecName: "kube-api-access-ml4hg") pod "601c7f4e-1398-426d-a180-0c7d85173efa" (UID: "601c7f4e-1398-426d-a180-0c7d85173efa"). InnerVolumeSpecName "kube-api-access-ml4hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:36:02 crc kubenswrapper[4763]: I0930 14:36:02.867019 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "601c7f4e-1398-426d-a180-0c7d85173efa" (UID: "601c7f4e-1398-426d-a180-0c7d85173efa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:36:02 crc kubenswrapper[4763]: I0930 14:36:02.922366 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml4hg\" (UniqueName: \"kubernetes.io/projected/601c7f4e-1398-426d-a180-0c7d85173efa-kube-api-access-ml4hg\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:02 crc kubenswrapper[4763]: I0930 14:36:02.922401 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:02 crc kubenswrapper[4763]: I0930 14:36:02.922415 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/601c7f4e-1398-426d-a180-0c7d85173efa-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.280288 4763 generic.go:334] "Generic (PLEG): container finished" podID="601c7f4e-1398-426d-a180-0c7d85173efa" containerID="0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2" exitCode=0 Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.280338 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlzqc" event={"ID":"601c7f4e-1398-426d-a180-0c7d85173efa","Type":"ContainerDied","Data":"0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2"} Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.280376 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlzqc" event={"ID":"601c7f4e-1398-426d-a180-0c7d85173efa","Type":"ContainerDied","Data":"bfcf0f4f4a48aeee0dc8d9c03f9287c9de9dc6c1ddd9d9c2e03623e1c447da0d"} Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.280398 4763 scope.go:117] "RemoveContainer" containerID="0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2" Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.280404 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qlzqc"
Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.303481 4763 scope.go:117] "RemoveContainer" containerID="5b166af20807cae2e28a4ffcec09d48b8354212a27dd33399661ee7c9ab7b82c"
Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.314104 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qlzqc"]
Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.319910 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qlzqc"]
Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.338513 4763 scope.go:117] "RemoveContainer" containerID="8d94463b3fbeefc81089baaac58e6067cff75fd76ad951ddeb5edc9eba54825c"
Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.353631 4763 scope.go:117] "RemoveContainer" containerID="0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2"
Sep 30 14:36:03 crc kubenswrapper[4763]: E0930 14:36:03.354122 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2\": container with ID starting with 0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2 not found: ID does not exist" containerID="0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2"
Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.354162 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2"} err="failed to get container status \"0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2\": rpc error: code = NotFound desc = could not find container \"0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2\": container with ID starting with 0624dc8baf774c40b2d447067d909c191018f30b9940d35e62d28a5b25ee7eb2 not found: ID does not exist"
Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.354190 4763 scope.go:117] "RemoveContainer" containerID="5b166af20807cae2e28a4ffcec09d48b8354212a27dd33399661ee7c9ab7b82c"
Sep 30 14:36:03 crc kubenswrapper[4763]: E0930 14:36:03.354495 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b166af20807cae2e28a4ffcec09d48b8354212a27dd33399661ee7c9ab7b82c\": container with ID starting with 5b166af20807cae2e28a4ffcec09d48b8354212a27dd33399661ee7c9ab7b82c not found: ID does not exist" containerID="5b166af20807cae2e28a4ffcec09d48b8354212a27dd33399661ee7c9ab7b82c"
Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.354565 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b166af20807cae2e28a4ffcec09d48b8354212a27dd33399661ee7c9ab7b82c"} err="failed to get container status \"5b166af20807cae2e28a4ffcec09d48b8354212a27dd33399661ee7c9ab7b82c\": rpc error: code = NotFound desc = could not find container \"5b166af20807cae2e28a4ffcec09d48b8354212a27dd33399661ee7c9ab7b82c\": container with ID starting with 5b166af20807cae2e28a4ffcec09d48b8354212a27dd33399661ee7c9ab7b82c not found: ID does not exist"
Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.354614 4763 scope.go:117] "RemoveContainer" containerID="8d94463b3fbeefc81089baaac58e6067cff75fd76ad951ddeb5edc9eba54825c"
Sep 30 14:36:03 crc kubenswrapper[4763]: E0930 14:36:03.355103 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d94463b3fbeefc81089baaac58e6067cff75fd76ad951ddeb5edc9eba54825c\": container with ID starting with 8d94463b3fbeefc81089baaac58e6067cff75fd76ad951ddeb5edc9eba54825c not found: ID does not exist" containerID="8d94463b3fbeefc81089baaac58e6067cff75fd76ad951ddeb5edc9eba54825c"
Sep 30 14:36:03 crc kubenswrapper[4763]: I0930 14:36:03.355141 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d94463b3fbeefc81089baaac58e6067cff75fd76ad951ddeb5edc9eba54825c"} err="failed to get container status \"8d94463b3fbeefc81089baaac58e6067cff75fd76ad951ddeb5edc9eba54825c\": rpc error: code = NotFound desc = could not find container \"8d94463b3fbeefc81089baaac58e6067cff75fd76ad951ddeb5edc9eba54825c\": container with ID starting with 8d94463b3fbeefc81089baaac58e6067cff75fd76ad951ddeb5edc9eba54825c not found: ID does not exist"
Sep 30 14:36:04 crc kubenswrapper[4763]: I0930 14:36:04.497121 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="601c7f4e-1398-426d-a180-0c7d85173efa" path="/var/lib/kubelet/pods/601c7f4e-1398-426d-a180-0c7d85173efa/volumes"
Sep 30 14:36:09 crc kubenswrapper[4763]: I0930 14:36:09.489553 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:36:09 crc kubenswrapper[4763]: E0930 14:36:09.489858 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:36:24 crc kubenswrapper[4763]: I0930 14:36:24.491008 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:36:24 crc kubenswrapper[4763]: E0930 14:36:24.491784 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:36:35 crc kubenswrapper[4763]: I0930 14:36:35.489941 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:36:35 crc kubenswrapper[4763]: E0930 14:36:35.490858 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:36:50 crc kubenswrapper[4763]: I0930 14:36:50.489526 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:36:50 crc kubenswrapper[4763]: E0930 14:36:50.490751 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:37:02 crc kubenswrapper[4763]: I0930 14:37:02.489131 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:37:02 crc kubenswrapper[4763]: E0930 14:37:02.489888 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:37:16 crc kubenswrapper[4763]: I0930 14:37:16.489785 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:37:16 crc kubenswrapper[4763]: E0930 14:37:16.490611 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:37:30 crc kubenswrapper[4763]: I0930 14:37:30.488996 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:37:30 crc kubenswrapper[4763]: E0930 14:37:30.489794 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:37:43 crc kubenswrapper[4763]: I0930 14:37:43.489886 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:37:43 crc kubenswrapper[4763]: E0930 14:37:43.490760 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:37:56 crc kubenswrapper[4763]: I0930 14:37:56.489637 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:37:56 crc kubenswrapper[4763]: E0930 14:37:56.491067 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:38:07 crc kubenswrapper[4763]: I0930 14:38:07.489898 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:38:07 crc kubenswrapper[4763]: E0930 14:38:07.490582 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:38:21 crc kubenswrapper[4763]: I0930 14:38:21.489794 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:38:21 crc kubenswrapper[4763]: E0930 14:38:21.490796 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:38:34 crc kubenswrapper[4763]: I0930 14:38:34.490424 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:38:34 crc kubenswrapper[4763]: E0930 14:38:34.491149 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:38:47 crc kubenswrapper[4763]: I0930 14:38:47.489849 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:38:47 crc kubenswrapper[4763]: E0930 14:38:47.491128 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:38:59 crc kubenswrapper[4763]: I0930 14:38:59.489298 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:38:59 crc kubenswrapper[4763]: E0930 14:38:59.490051 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:39:11 crc kubenswrapper[4763]: I0930 14:39:11.489728 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:39:11 crc kubenswrapper[4763]: E0930 14:39:11.490473 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:39:24 crc kubenswrapper[4763]: I0930 14:39:24.490724 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:39:24 crc kubenswrapper[4763]: E0930 14:39:24.491547 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:39:37 crc kubenswrapper[4763]: I0930 14:39:37.490585 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:39:37 crc kubenswrapper[4763]: E0930 14:39:37.491773 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:39:52 crc kubenswrapper[4763]: I0930 14:39:52.490168 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:39:52 crc kubenswrapper[4763]: E0930 14:39:52.491336 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:40:06 crc kubenswrapper[4763]: I0930 14:40:06.490250 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:40:07 crc kubenswrapper[4763]: I0930 14:40:07.031418 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"8aaebda55471ddac0f8dda1069d5b1da2893a53096db99d7b0dca7f07c1c32ab"}
Sep 30 14:41:23 crc kubenswrapper[4763]: I0930 14:41:23.829188 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-snrrt"]
Sep 30 14:41:23 crc kubenswrapper[4763]: E0930 14:41:23.830050 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601c7f4e-1398-426d-a180-0c7d85173efa" containerName="registry-server"
Sep 30 14:41:23 crc kubenswrapper[4763]: I0930 14:41:23.830066 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="601c7f4e-1398-426d-a180-0c7d85173efa" containerName="registry-server"
Sep 30 14:41:23 crc kubenswrapper[4763]: E0930 14:41:23.830082 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601c7f4e-1398-426d-a180-0c7d85173efa" containerName="extract-content"
Sep 30 14:41:23 crc kubenswrapper[4763]: I0930 14:41:23.830087 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="601c7f4e-1398-426d-a180-0c7d85173efa" containerName="extract-content"
Sep 30 14:41:23 crc kubenswrapper[4763]: E0930 14:41:23.830115 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601c7f4e-1398-426d-a180-0c7d85173efa" containerName="extract-utilities"
Sep 30 14:41:23 crc kubenswrapper[4763]: I0930 14:41:23.830121 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="601c7f4e-1398-426d-a180-0c7d85173efa" containerName="extract-utilities"
Sep 30 14:41:23 crc kubenswrapper[4763]: I0930 14:41:23.830252 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="601c7f4e-1398-426d-a180-0c7d85173efa" containerName="registry-server"
Sep 30 14:41:23 crc kubenswrapper[4763]: I0930 14:41:23.831233 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:23 crc kubenswrapper[4763]: I0930 14:41:23.841787 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snrrt"]
Sep 30 14:41:23 crc kubenswrapper[4763]: I0930 14:41:23.962370 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-catalog-content\") pod \"redhat-marketplace-snrrt\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") " pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:23 crc kubenswrapper[4763]: I0930 14:41:23.962520 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-utilities\") pod \"redhat-marketplace-snrrt\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") " pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:23 crc kubenswrapper[4763]: I0930 14:41:23.962613 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dcmx\" (UniqueName: \"kubernetes.io/projected/8a406c68-d827-433b-9c05-7e88d8944ed7-kube-api-access-7dcmx\") pod \"redhat-marketplace-snrrt\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") " pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:24 crc kubenswrapper[4763]: I0930 14:41:24.064387 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-catalog-content\") pod \"redhat-marketplace-snrrt\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") " pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:24 crc kubenswrapper[4763]: I0930 14:41:24.064464 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-utilities\") pod \"redhat-marketplace-snrrt\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") " pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:24 crc kubenswrapper[4763]: I0930 14:41:24.064498 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dcmx\" (UniqueName: \"kubernetes.io/projected/8a406c68-d827-433b-9c05-7e88d8944ed7-kube-api-access-7dcmx\") pod \"redhat-marketplace-snrrt\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") " pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:24 crc kubenswrapper[4763]: I0930 14:41:24.065111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-utilities\") pod \"redhat-marketplace-snrrt\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") " pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:24 crc kubenswrapper[4763]: I0930 14:41:24.065311 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-catalog-content\") pod \"redhat-marketplace-snrrt\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") " pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:24 crc kubenswrapper[4763]: I0930 14:41:24.082535 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dcmx\" (UniqueName: \"kubernetes.io/projected/8a406c68-d827-433b-9c05-7e88d8944ed7-kube-api-access-7dcmx\") pod \"redhat-marketplace-snrrt\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") " pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:24 crc kubenswrapper[4763]: I0930 14:41:24.170639 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:24 crc kubenswrapper[4763]: I0930 14:41:24.576437 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snrrt"]
Sep 30 14:41:24 crc kubenswrapper[4763]: I0930 14:41:24.619162 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snrrt" event={"ID":"8a406c68-d827-433b-9c05-7e88d8944ed7","Type":"ContainerStarted","Data":"928f963ccdcd0bf65cd097d548ded47fc3bd8a6dc9debf89841315cdb42fe1b6"}
Sep 30 14:41:25 crc kubenswrapper[4763]: I0930 14:41:25.628187 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a406c68-d827-433b-9c05-7e88d8944ed7" containerID="04c527e085dc5d2491d273722928d754e9d429a6efad94b5ba16ee700373569d" exitCode=0
Sep 30 14:41:25 crc kubenswrapper[4763]: I0930 14:41:25.628269 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snrrt" event={"ID":"8a406c68-d827-433b-9c05-7e88d8944ed7","Type":"ContainerDied","Data":"04c527e085dc5d2491d273722928d754e9d429a6efad94b5ba16ee700373569d"}
Sep 30 14:41:25 crc kubenswrapper[4763]: I0930 14:41:25.630394 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 14:41:26 crc kubenswrapper[4763]: I0930 14:41:26.639227 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snrrt" event={"ID":"8a406c68-d827-433b-9c05-7e88d8944ed7","Type":"ContainerStarted","Data":"8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931"}
Sep 30 14:41:27 crc kubenswrapper[4763]: I0930 14:41:27.646965 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a406c68-d827-433b-9c05-7e88d8944ed7" containerID="8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931" exitCode=0
Sep 30 14:41:27 crc kubenswrapper[4763]: I0930 14:41:27.647011 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snrrt" event={"ID":"8a406c68-d827-433b-9c05-7e88d8944ed7","Type":"ContainerDied","Data":"8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931"}
Sep 30 14:41:28 crc kubenswrapper[4763]: I0930 14:41:28.657659 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snrrt" event={"ID":"8a406c68-d827-433b-9c05-7e88d8944ed7","Type":"ContainerStarted","Data":"5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7"}
Sep 30 14:41:28 crc kubenswrapper[4763]: I0930 14:41:28.679388 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-snrrt" podStartSLOduration=3.170004876 podStartE2EDuration="5.679373423s" podCreationTimestamp="2025-09-30 14:41:23 +0000 UTC" firstStartedPulling="2025-09-30 14:41:25.630071453 +0000 UTC m=+3957.768631738" lastFinishedPulling="2025-09-30 14:41:28.13944 +0000 UTC m=+3960.278000285" observedRunningTime="2025-09-30 14:41:28.677993429 +0000 UTC m=+3960.816553734" watchObservedRunningTime="2025-09-30 14:41:28.679373423 +0000 UTC m=+3960.817933708"
Sep 30 14:41:34 crc kubenswrapper[4763]: I0930 14:41:34.170965 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:34 crc kubenswrapper[4763]: I0930 14:41:34.171306 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:34 crc kubenswrapper[4763]: I0930 14:41:34.212576 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:34 crc kubenswrapper[4763]: I0930 14:41:34.764002 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:34 crc kubenswrapper[4763]: I0930 14:41:34.812328 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snrrt"]
Sep 30 14:41:36 crc kubenswrapper[4763]: I0930 14:41:36.717148 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-snrrt" podUID="8a406c68-d827-433b-9c05-7e88d8944ed7" containerName="registry-server" containerID="cri-o://5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7" gracePeriod=2
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.136556 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.318383 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dcmx\" (UniqueName: \"kubernetes.io/projected/8a406c68-d827-433b-9c05-7e88d8944ed7-kube-api-access-7dcmx\") pod \"8a406c68-d827-433b-9c05-7e88d8944ed7\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") "
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.318513 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-utilities\") pod \"8a406c68-d827-433b-9c05-7e88d8944ed7\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") "
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.318676 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-catalog-content\") pod \"8a406c68-d827-433b-9c05-7e88d8944ed7\" (UID: \"8a406c68-d827-433b-9c05-7e88d8944ed7\") "
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.320134 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-utilities" (OuterVolumeSpecName: "utilities") pod "8a406c68-d827-433b-9c05-7e88d8944ed7" (UID: "8a406c68-d827-433b-9c05-7e88d8944ed7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.327215 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a406c68-d827-433b-9c05-7e88d8944ed7-kube-api-access-7dcmx" (OuterVolumeSpecName: "kube-api-access-7dcmx") pod "8a406c68-d827-433b-9c05-7e88d8944ed7" (UID: "8a406c68-d827-433b-9c05-7e88d8944ed7"). InnerVolumeSpecName "kube-api-access-7dcmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.334797 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a406c68-d827-433b-9c05-7e88d8944ed7" (UID: "8a406c68-d827-433b-9c05-7e88d8944ed7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.420898 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.420955 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a406c68-d827-433b-9c05-7e88d8944ed7-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.420977 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dcmx\" (UniqueName: \"kubernetes.io/projected/8a406c68-d827-433b-9c05-7e88d8944ed7-kube-api-access-7dcmx\") on node \"crc\" DevicePath \"\""
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.727359 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a406c68-d827-433b-9c05-7e88d8944ed7" containerID="5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7" exitCode=0
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.727413 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snrrt" event={"ID":"8a406c68-d827-433b-9c05-7e88d8944ed7","Type":"ContainerDied","Data":"5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7"}
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.727449 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snrrt" event={"ID":"8a406c68-d827-433b-9c05-7e88d8944ed7","Type":"ContainerDied","Data":"928f963ccdcd0bf65cd097d548ded47fc3bd8a6dc9debf89841315cdb42fe1b6"}
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.727478 4763 scope.go:117] "RemoveContainer" containerID="5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7"
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.727667 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snrrt"
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.761782 4763 scope.go:117] "RemoveContainer" containerID="8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931"
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.765878 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snrrt"]
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.773342 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-snrrt"]
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.793998 4763 scope.go:117] "RemoveContainer" containerID="04c527e085dc5d2491d273722928d754e9d429a6efad94b5ba16ee700373569d"
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.818073 4763 scope.go:117] "RemoveContainer" containerID="5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7"
Sep 30 14:41:37 crc kubenswrapper[4763]: E0930 14:41:37.818839 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7\": container with ID starting with 5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7 not found: ID does not exist" containerID="5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7"
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.818894 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7"} err="failed to get container status \"5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7\": rpc error: code = NotFound desc = could not find container \"5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7\": container with ID starting with 5572b9775d05d32c60331e957be89f762ce2aec0892d0163406408a45fb10dc7 not found: ID does not exist"
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.818945 4763 scope.go:117] "RemoveContainer" containerID="8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931"
Sep 30 14:41:37 crc kubenswrapper[4763]: E0930 14:41:37.819270 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931\": container with ID starting with 8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931 not found: ID does not exist" containerID="8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931"
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.819313 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931"} err="failed to get container status \"8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931\": rpc error: code = NotFound desc = could not find container \"8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931\": container with ID starting with 8a4cf02df68b2a736f2407c4c6c0b483984a3fda1b1f29d3a9d3b76890d6a931 not found: ID does not exist"
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.819339 4763 scope.go:117] "RemoveContainer" containerID="04c527e085dc5d2491d273722928d754e9d429a6efad94b5ba16ee700373569d"
Sep 30 14:41:37 crc kubenswrapper[4763]: E0930 14:41:37.819684 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c527e085dc5d2491d273722928d754e9d429a6efad94b5ba16ee700373569d\": container with ID starting with 04c527e085dc5d2491d273722928d754e9d429a6efad94b5ba16ee700373569d not found: ID does not exist" containerID="04c527e085dc5d2491d273722928d754e9d429a6efad94b5ba16ee700373569d"
Sep 30 14:41:37 crc kubenswrapper[4763]: I0930 14:41:37.819719 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c527e085dc5d2491d273722928d754e9d429a6efad94b5ba16ee700373569d"} err="failed to get container status \"04c527e085dc5d2491d273722928d754e9d429a6efad94b5ba16ee700373569d\": rpc error: code = NotFound desc = could not find container \"04c527e085dc5d2491d273722928d754e9d429a6efad94b5ba16ee700373569d\": container with ID starting with 04c527e085dc5d2491d273722928d754e9d429a6efad94b5ba16ee700373569d not found: ID does not exist"
Sep 30 14:41:38 crc kubenswrapper[4763]: I0930 14:41:38.506782 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a406c68-d827-433b-9c05-7e88d8944ed7" path="/var/lib/kubelet/pods/8a406c68-d827-433b-9c05-7e88d8944ed7/volumes"
Sep 30 14:42:06 crc kubenswrapper[4763]: I0930 14:42:06.059910 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:42:06 crc kubenswrapper[4763]: I0930 14:42:06.060492 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:42:36 crc kubenswrapper[4763]: I0930 14:42:36.059819 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:42:36 crc kubenswrapper[4763]: I0930 14:42:36.060359 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:43:06 crc kubenswrapper[4763]: I0930 14:43:06.060325 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:43:06 crc kubenswrapper[4763]: I0930 14:43:06.061056 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:43:06 crc kubenswrapper[4763]: I0930 14:43:06.061129 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns"
Sep 30 14:43:06 crc kubenswrapper[4763]: I0930 14:43:06.062048 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8aaebda55471ddac0f8dda1069d5b1da2893a53096db99d7b0dca7f07c1c32ab"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 14:43:06 crc kubenswrapper[4763]: I0930 14:43:06.062148 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://8aaebda55471ddac0f8dda1069d5b1da2893a53096db99d7b0dca7f07c1c32ab" gracePeriod=600
Sep 30 14:43:06 crc kubenswrapper[4763]: I0930 14:43:06.430223 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="8aaebda55471ddac0f8dda1069d5b1da2893a53096db99d7b0dca7f07c1c32ab" exitCode=0
Sep 30 14:43:06 crc kubenswrapper[4763]: I0930 14:43:06.430682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"8aaebda55471ddac0f8dda1069d5b1da2893a53096db99d7b0dca7f07c1c32ab"}
Sep 30 14:43:06 crc kubenswrapper[4763]: I0930 14:43:06.430729 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e"}
Sep 30 14:43:06 crc kubenswrapper[4763]: I0930 14:43:06.430756 4763 scope.go:117] "RemoveContainer" containerID="9da41777d9260497c17471da493e81f845593310e1024d15508d5bdfc376914c"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.149749 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"]
Sep 30 14:45:00 crc kubenswrapper[4763]: E0930 14:45:00.150540 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a406c68-d827-433b-9c05-7e88d8944ed7" containerName="extract-content"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.150553 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a406c68-d827-433b-9c05-7e88d8944ed7" containerName="extract-content"
Sep 30 14:45:00 crc kubenswrapper[4763]: E0930 14:45:00.150564 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a406c68-d827-433b-9c05-7e88d8944ed7" containerName="extract-utilities"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.150570 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a406c68-d827-433b-9c05-7e88d8944ed7" containerName="extract-utilities"
Sep 30 14:45:00 crc kubenswrapper[4763]: E0930 14:45:00.150610 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a406c68-d827-433b-9c05-7e88d8944ed7" containerName="registry-server"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.150616 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a406c68-d827-433b-9c05-7e88d8944ed7" containerName="registry-server"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.150752 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a406c68-d827-433b-9c05-7e88d8944ed7" containerName="registry-server"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.151276 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.155232 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.155557 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.171071 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"]
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.231261 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26gw2\" (UniqueName: \"kubernetes.io/projected/9a01dd29-890a-4a2a-ab00-e96368a9b401-kube-api-access-26gw2\") pod \"collect-profiles-29320725-22kjp\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.231356 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a01dd29-890a-4a2a-ab00-e96368a9b401-config-volume\") pod \"collect-profiles-29320725-22kjp\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.231401 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a01dd29-890a-4a2a-ab00-e96368a9b401-secret-volume\") pod \"collect-profiles-29320725-22kjp\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.332359 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26gw2\" (UniqueName: \"kubernetes.io/projected/9a01dd29-890a-4a2a-ab00-e96368a9b401-kube-api-access-26gw2\") pod \"collect-profiles-29320725-22kjp\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.332430 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a01dd29-890a-4a2a-ab00-e96368a9b401-config-volume\") pod \"collect-profiles-29320725-22kjp\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.332462 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a01dd29-890a-4a2a-ab00-e96368a9b401-secret-volume\") pod \"collect-profiles-29320725-22kjp\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.333820 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a01dd29-890a-4a2a-ab00-e96368a9b401-config-volume\") pod \"collect-profiles-29320725-22kjp\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.348304 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a01dd29-890a-4a2a-ab00-e96368a9b401-secret-volume\") pod \"collect-profiles-29320725-22kjp\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.350440 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26gw2\" (UniqueName: \"kubernetes.io/projected/9a01dd29-890a-4a2a-ab00-e96368a9b401-kube-api-access-26gw2\") pod \"collect-profiles-29320725-22kjp\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.475578 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:00 crc kubenswrapper[4763]: I0930 14:45:00.927747 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"]
Sep 30 14:45:01 crc kubenswrapper[4763]: I0930 14:45:01.297691 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp" event={"ID":"9a01dd29-890a-4a2a-ab00-e96368a9b401","Type":"ContainerStarted","Data":"66dde71ef9fce8609a17b29bf5ae2bc62affdb6a47498d8c63d6ef31391a2efd"}
Sep 30 14:45:01 crc kubenswrapper[4763]: I0930 14:45:01.298050 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp" event={"ID":"9a01dd29-890a-4a2a-ab00-e96368a9b401","Type":"ContainerStarted","Data":"f2bd34364954d88da942d605c24ef8865a7dbbfed473c0065a3c98b09036a8a4"}
Sep 30 14:45:01 crc kubenswrapper[4763]: I0930 14:45:01.318435 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp" podStartSLOduration=1.318418925 podStartE2EDuration="1.318418925s" podCreationTimestamp="2025-09-30 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:45:01.31420848 +0000 UTC m=+4173.452768775" watchObservedRunningTime="2025-09-30 14:45:01.318418925 +0000 UTC m=+4173.456979210"
Sep 30 14:45:02 crc kubenswrapper[4763]: I0930 14:45:02.308218 4763 generic.go:334] "Generic (PLEG): container finished" podID="9a01dd29-890a-4a2a-ab00-e96368a9b401" containerID="66dde71ef9fce8609a17b29bf5ae2bc62affdb6a47498d8c63d6ef31391a2efd" exitCode=0
Sep 30 14:45:02 crc kubenswrapper[4763]: I0930 14:45:02.308262 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp" event={"ID":"9a01dd29-890a-4a2a-ab00-e96368a9b401","Type":"ContainerDied","Data":"66dde71ef9fce8609a17b29bf5ae2bc62affdb6a47498d8c63d6ef31391a2efd"}
Sep 30 14:45:03 crc kubenswrapper[4763]: I0930 14:45:03.564581 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:03 crc kubenswrapper[4763]: I0930 14:45:03.683086 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26gw2\" (UniqueName: \"kubernetes.io/projected/9a01dd29-890a-4a2a-ab00-e96368a9b401-kube-api-access-26gw2\") pod \"9a01dd29-890a-4a2a-ab00-e96368a9b401\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") "
Sep 30 14:45:03 crc kubenswrapper[4763]: I0930 14:45:03.683249 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a01dd29-890a-4a2a-ab00-e96368a9b401-secret-volume\") pod \"9a01dd29-890a-4a2a-ab00-e96368a9b401\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") "
Sep 30 14:45:03 crc kubenswrapper[4763]: I0930 14:45:03.683900 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a01dd29-890a-4a2a-ab00-e96368a9b401-config-volume" (OuterVolumeSpecName: "config-volume") pod "9a01dd29-890a-4a2a-ab00-e96368a9b401" (UID: "9a01dd29-890a-4a2a-ab00-e96368a9b401"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:45:03 crc kubenswrapper[4763]: I0930 14:45:03.684020 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a01dd29-890a-4a2a-ab00-e96368a9b401-config-volume\") pod \"9a01dd29-890a-4a2a-ab00-e96368a9b401\" (UID: \"9a01dd29-890a-4a2a-ab00-e96368a9b401\") "
Sep 30 14:45:03 crc kubenswrapper[4763]: I0930 14:45:03.684311 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a01dd29-890a-4a2a-ab00-e96368a9b401-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 14:45:03 crc kubenswrapper[4763]: I0930 14:45:03.688063 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a01dd29-890a-4a2a-ab00-e96368a9b401-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9a01dd29-890a-4a2a-ab00-e96368a9b401" (UID: "9a01dd29-890a-4a2a-ab00-e96368a9b401"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:45:03 crc kubenswrapper[4763]: I0930 14:45:03.688405 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a01dd29-890a-4a2a-ab00-e96368a9b401-kube-api-access-26gw2" (OuterVolumeSpecName: "kube-api-access-26gw2") pod "9a01dd29-890a-4a2a-ab00-e96368a9b401" (UID: "9a01dd29-890a-4a2a-ab00-e96368a9b401"). InnerVolumeSpecName "kube-api-access-26gw2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:45:03 crc kubenswrapper[4763]: I0930 14:45:03.785239 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a01dd29-890a-4a2a-ab00-e96368a9b401-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 14:45:03 crc kubenswrapper[4763]: I0930 14:45:03.785557 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26gw2\" (UniqueName: \"kubernetes.io/projected/9a01dd29-890a-4a2a-ab00-e96368a9b401-kube-api-access-26gw2\") on node \"crc\" DevicePath \"\""
Sep 30 14:45:04 crc kubenswrapper[4763]: I0930 14:45:04.330006 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp" event={"ID":"9a01dd29-890a-4a2a-ab00-e96368a9b401","Type":"ContainerDied","Data":"f2bd34364954d88da942d605c24ef8865a7dbbfed473c0065a3c98b09036a8a4"}
Sep 30 14:45:04 crc kubenswrapper[4763]: I0930 14:45:04.330053 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2bd34364954d88da942d605c24ef8865a7dbbfed473c0065a3c98b09036a8a4"
Sep 30 14:45:04 crc kubenswrapper[4763]: I0930 14:45:04.330129 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22kjp"
Sep 30 14:45:04 crc kubenswrapper[4763]: I0930 14:45:04.399187 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv"]
Sep 30 14:45:04 crc kubenswrapper[4763]: I0930 14:45:04.404028 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-6pfcv"]
Sep 30 14:45:04 crc kubenswrapper[4763]: I0930 14:45:04.507976 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2317d90b-e233-4f27-b3bc-f60b3aaea8ea" path="/var/lib/kubelet/pods/2317d90b-e233-4f27-b3bc-f60b3aaea8ea/volumes"
Sep 30 14:45:06 crc kubenswrapper[4763]: I0930 14:45:06.059491 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:45:06 crc kubenswrapper[4763]: I0930 14:45:06.060876 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:45:33 crc kubenswrapper[4763]: I0930 14:45:33.538107 4763 scope.go:117] "RemoveContainer" containerID="84edac1b690c46b42d3b51685afd16e7387e24b36a53232269097482ee924af2"
Sep 30 14:45:36 crc kubenswrapper[4763]: I0930 14:45:36.059659 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:45:36 crc kubenswrapper[4763]: I0930 14:45:36.060038 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:46:05 crc kubenswrapper[4763]: I0930 14:46:05.768709 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c7bh7"]
Sep 30 14:46:05 crc kubenswrapper[4763]: E0930 14:46:05.769728 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a01dd29-890a-4a2a-ab00-e96368a9b401" containerName="collect-profiles"
Sep 30 14:46:05 crc kubenswrapper[4763]: I0930 14:46:05.769744 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a01dd29-890a-4a2a-ab00-e96368a9b401" containerName="collect-profiles"
Sep 30 14:46:05 crc kubenswrapper[4763]: I0930 14:46:05.769885 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a01dd29-890a-4a2a-ab00-e96368a9b401" containerName="collect-profiles"
Sep 30 14:46:05 crc kubenswrapper[4763]: I0930 14:46:05.773024 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:05 crc kubenswrapper[4763]: I0930 14:46:05.785178 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c7bh7"]
Sep 30 14:46:05 crc kubenswrapper[4763]: I0930 14:46:05.941066 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-catalog-content\") pod \"certified-operators-c7bh7\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") " pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:05 crc kubenswrapper[4763]: I0930 14:46:05.941157 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7vnh\" (UniqueName: \"kubernetes.io/projected/bed257de-5c30-4e39-aea3-8a15671b228a-kube-api-access-g7vnh\") pod \"certified-operators-c7bh7\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") " pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:05 crc kubenswrapper[4763]: I0930 14:46:05.941183 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-utilities\") pod \"certified-operators-c7bh7\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") " pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.042121 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7vnh\" (UniqueName: \"kubernetes.io/projected/bed257de-5c30-4e39-aea3-8a15671b228a-kube-api-access-g7vnh\") pod \"certified-operators-c7bh7\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") " pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.042163 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-utilities\") pod \"certified-operators-c7bh7\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") " pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.042244 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-catalog-content\") pod \"certified-operators-c7bh7\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") " pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.042787 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-catalog-content\") pod \"certified-operators-c7bh7\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") " pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.042908 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-utilities\") pod \"certified-operators-c7bh7\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") " pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.061735 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.061802 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.061853 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.062470 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.062532 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" gracePeriod=600
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.076576 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7vnh\" (UniqueName: \"kubernetes.io/projected/bed257de-5c30-4e39-aea3-8a15671b228a-kube-api-access-g7vnh\") pod \"certified-operators-c7bh7\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") " pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.088889 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:06 crc kubenswrapper[4763]: E0930 14:46:06.193207 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.571256 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c7bh7"]
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.819814 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" exitCode=0
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.819878 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e"}
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.819909 4763 scope.go:117] "RemoveContainer" containerID="8aaebda55471ddac0f8dda1069d5b1da2893a53096db99d7b0dca7f07c1c32ab"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.820384 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e"
Sep 30 14:46:06 crc kubenswrapper[4763]: E0930 14:46:06.820577 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.822733 4763 generic.go:334] "Generic (PLEG): container finished" podID="bed257de-5c30-4e39-aea3-8a15671b228a" containerID="411256cbd12eef6ede31367a20fe4bf4f774deb23714fe31f1ec6b36ecdb99a9" exitCode=0
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.822763 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bh7" event={"ID":"bed257de-5c30-4e39-aea3-8a15671b228a","Type":"ContainerDied","Data":"411256cbd12eef6ede31367a20fe4bf4f774deb23714fe31f1ec6b36ecdb99a9"}
Sep 30 14:46:06 crc kubenswrapper[4763]: I0930 14:46:06.822787 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bh7" event={"ID":"bed257de-5c30-4e39-aea3-8a15671b228a","Type":"ContainerStarted","Data":"2ec122bf1df664012147e29f3269c5cb45ff38b725ba881807ba0d0f1b02d66e"}
Sep 30 14:46:07 crc kubenswrapper[4763]: I0930 14:46:07.832796 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bh7" event={"ID":"bed257de-5c30-4e39-aea3-8a15671b228a","Type":"ContainerStarted","Data":"faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009"}
Sep 30 14:46:08 crc kubenswrapper[4763]: I0930 14:46:08.842361 4763 generic.go:334] "Generic (PLEG): container finished" podID="bed257de-5c30-4e39-aea3-8a15671b228a" containerID="faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009" exitCode=0
Sep 30 14:46:08 crc kubenswrapper[4763]: I0930 14:46:08.842428 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bh7" event={"ID":"bed257de-5c30-4e39-aea3-8a15671b228a","Type":"ContainerDied","Data":"faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009"}
Sep 30 14:46:09 crc kubenswrapper[4763]: I0930 14:46:09.850436 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bh7" event={"ID":"bed257de-5c30-4e39-aea3-8a15671b228a","Type":"ContainerStarted","Data":"5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e"}
Sep 30 14:46:09 crc kubenswrapper[4763]: I0930 14:46:09.873839 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c7bh7" podStartSLOduration=2.428665596 podStartE2EDuration="4.873819387s" podCreationTimestamp="2025-09-30 14:46:05 +0000 UTC" firstStartedPulling="2025-09-30 14:46:06.82485125 +0000 UTC m=+4238.963411525" lastFinishedPulling="2025-09-30 14:46:09.270005031 +0000 UTC m=+4241.408565316" observedRunningTime="2025-09-30 14:46:09.866368742 +0000 UTC m=+4242.004929027" watchObservedRunningTime="2025-09-30 14:46:09.873819387 +0000 UTC m=+4242.012379672"
Sep 30 14:46:16 crc kubenswrapper[4763]: I0930 14:46:16.089664 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:16 crc kubenswrapper[4763]: I0930 14:46:16.090143 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:16 crc kubenswrapper[4763]: I0930 14:46:16.142851 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:16 crc kubenswrapper[4763]: I0930 14:46:16.955086 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:16 crc kubenswrapper[4763]: I0930 14:46:16.996857 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c7bh7"]
Sep 30 14:46:18 crc kubenswrapper[4763]: I0930 14:46:18.496666 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e"
Sep 30 14:46:18 crc kubenswrapper[4763]: E0930 14:46:18.496965 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:46:18 crc kubenswrapper[4763]: I0930 14:46:18.915701 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c7bh7" podUID="bed257de-5c30-4e39-aea3-8a15671b228a" containerName="registry-server" containerID="cri-o://5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e" gracePeriod=2
Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.306276 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c7bh7"
Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.429959 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7vnh\" (UniqueName: \"kubernetes.io/projected/bed257de-5c30-4e39-aea3-8a15671b228a-kube-api-access-g7vnh\") pod \"bed257de-5c30-4e39-aea3-8a15671b228a\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") "
Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.430045 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-utilities\") pod \"bed257de-5c30-4e39-aea3-8a15671b228a\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") "
Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.430175 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-catalog-content\") pod \"bed257de-5c30-4e39-aea3-8a15671b228a\" (UID: \"bed257de-5c30-4e39-aea3-8a15671b228a\") "
Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.431547 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-utilities" (OuterVolumeSpecName: "utilities") pod "bed257de-5c30-4e39-aea3-8a15671b228a" (UID: "bed257de-5c30-4e39-aea3-8a15671b228a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.437444 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed257de-5c30-4e39-aea3-8a15671b228a-kube-api-access-g7vnh" (OuterVolumeSpecName: "kube-api-access-g7vnh") pod "bed257de-5c30-4e39-aea3-8a15671b228a" (UID: "bed257de-5c30-4e39-aea3-8a15671b228a"). InnerVolumeSpecName "kube-api-access-g7vnh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.476992 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bed257de-5c30-4e39-aea3-8a15671b228a" (UID: "bed257de-5c30-4e39-aea3-8a15671b228a"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.531548 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.531590 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bed257de-5c30-4e39-aea3-8a15671b228a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.531620 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7vnh\" (UniqueName: \"kubernetes.io/projected/bed257de-5c30-4e39-aea3-8a15671b228a-kube-api-access-g7vnh\") on node \"crc\" DevicePath \"\"" Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.924802 4763 generic.go:334] "Generic (PLEG): container finished" podID="bed257de-5c30-4e39-aea3-8a15671b228a" containerID="5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e" exitCode=0 Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.924849 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bh7" event={"ID":"bed257de-5c30-4e39-aea3-8a15671b228a","Type":"ContainerDied","Data":"5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e"} Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.924879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bh7" event={"ID":"bed257de-5c30-4e39-aea3-8a15671b228a","Type":"ContainerDied","Data":"2ec122bf1df664012147e29f3269c5cb45ff38b725ba881807ba0d0f1b02d66e"} Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.924851 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c7bh7" Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.924900 4763 scope.go:117] "RemoveContainer" containerID="5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e" Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.943665 4763 scope.go:117] "RemoveContainer" containerID="faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009" Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.960056 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c7bh7"] Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.968365 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c7bh7"] Sep 30 14:46:19 crc kubenswrapper[4763]: I0930 14:46:19.982469 4763 scope.go:117] "RemoveContainer" containerID="411256cbd12eef6ede31367a20fe4bf4f774deb23714fe31f1ec6b36ecdb99a9" Sep 30 14:46:20 crc kubenswrapper[4763]: I0930 14:46:20.001104 4763 scope.go:117] "RemoveContainer" containerID="5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e" Sep 30 14:46:20 crc kubenswrapper[4763]: E0930 14:46:20.001476 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e\": container with ID starting with 5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e not found: ID does not exist" containerID="5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e" Sep 30 14:46:20 crc kubenswrapper[4763]: I0930 14:46:20.001510 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e"} err="failed to get container status \"5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e\": rpc error: code = NotFound desc = could not find container \"5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e\": container with ID starting with 5623462514e9ae8cf0b9332cbad50c57953f5f8af34b6a61e8d9c3979ab7bb1e not found: ID does not exist" Sep 30 14:46:20 crc kubenswrapper[4763]: I0930 14:46:20.001539 4763 scope.go:117] "RemoveContainer" containerID="faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009" Sep 30 14:46:20 crc kubenswrapper[4763]: E0930 14:46:20.001804 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009\": container with ID starting with faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009 not found: ID does not exist" containerID="faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009" Sep 30 14:46:20 crc kubenswrapper[4763]: I0930 14:46:20.001851 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009"} err="failed to get container status \"faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009\": rpc error: code = NotFound desc = could not find container \"faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009\": container with ID starting with faef94856ee92eace5305b40a937feab78442a05ab930f90df826948ea943009 not found: ID does not exist" Sep 30 14:46:20 crc kubenswrapper[4763]: I0930 14:46:20.001887 4763 scope.go:117] "RemoveContainer" 
containerID="411256cbd12eef6ede31367a20fe4bf4f774deb23714fe31f1ec6b36ecdb99a9" Sep 30 14:46:20 crc kubenswrapper[4763]: E0930 14:46:20.002223 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"411256cbd12eef6ede31367a20fe4bf4f774deb23714fe31f1ec6b36ecdb99a9\": container with ID starting with 411256cbd12eef6ede31367a20fe4bf4f774deb23714fe31f1ec6b36ecdb99a9 not found: ID does not exist" containerID="411256cbd12eef6ede31367a20fe4bf4f774deb23714fe31f1ec6b36ecdb99a9" Sep 30 14:46:20 crc kubenswrapper[4763]: I0930 14:46:20.002259 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"411256cbd12eef6ede31367a20fe4bf4f774deb23714fe31f1ec6b36ecdb99a9"} err="failed to get container status \"411256cbd12eef6ede31367a20fe4bf4f774deb23714fe31f1ec6b36ecdb99a9\": rpc error: code = NotFound desc = could not find container \"411256cbd12eef6ede31367a20fe4bf4f774deb23714fe31f1ec6b36ecdb99a9\": container with ID starting with 411256cbd12eef6ede31367a20fe4bf4f774deb23714fe31f1ec6b36ecdb99a9 not found: ID does not exist" Sep 30 14:46:20 crc kubenswrapper[4763]: I0930 14:46:20.508176 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed257de-5c30-4e39-aea3-8a15671b228a" path="/var/lib/kubelet/pods/bed257de-5c30-4e39-aea3-8a15671b228a/volumes" Sep 30 14:46:33 crc kubenswrapper[4763]: I0930 14:46:33.490704 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:46:33 crc kubenswrapper[4763]: E0930 14:46:33.491977 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:46:47 crc kubenswrapper[4763]: I0930 14:46:47.489504 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:46:47 crc kubenswrapper[4763]: E0930 14:46:47.490398 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:47:00 crc kubenswrapper[4763]: I0930 14:47:00.976577 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5cj42"] Sep 30 14:47:00 crc kubenswrapper[4763]: E0930 14:47:00.977520 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed257de-5c30-4e39-aea3-8a15671b228a" containerName="extract-content" Sep 30 14:47:00 crc kubenswrapper[4763]: I0930 14:47:00.977537 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed257de-5c30-4e39-aea3-8a15671b228a" containerName="extract-content" Sep 30 14:47:00 crc kubenswrapper[4763]: E0930 14:47:00.977566 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed257de-5c30-4e39-aea3-8a15671b228a" containerName="registry-server" Sep 30 14:47:00 crc kubenswrapper[4763]: I0930 
14:47:00.977573 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed257de-5c30-4e39-aea3-8a15671b228a" containerName="registry-server" Sep 30 14:47:00 crc kubenswrapper[4763]: E0930 14:47:00.977584 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed257de-5c30-4e39-aea3-8a15671b228a" containerName="extract-utilities" Sep 30 14:47:00 crc kubenswrapper[4763]: I0930 14:47:00.977591 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed257de-5c30-4e39-aea3-8a15671b228a" containerName="extract-utilities" Sep 30 14:47:00 crc kubenswrapper[4763]: I0930 14:47:00.977836 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed257de-5c30-4e39-aea3-8a15671b228a" containerName="registry-server" Sep 30 14:47:00 crc kubenswrapper[4763]: I0930 14:47:00.979015 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:00 crc kubenswrapper[4763]: I0930 14:47:00.993047 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5cj42"] Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.113172 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-catalog-content\") pod \"community-operators-5cj42\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.113232 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-utilities\") pod \"community-operators-5cj42\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.113267 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvsg5\" (UniqueName: \"kubernetes.io/projected/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-kube-api-access-qvsg5\") pod \"community-operators-5cj42\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.215261 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-catalog-content\") pod \"community-operators-5cj42\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.215341 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-utilities\") pod \"community-operators-5cj42\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.215379 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvsg5\" (UniqueName: \"kubernetes.io/projected/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-kube-api-access-qvsg5\") pod \"community-operators-5cj42\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " pod="openshift-marketplace/community-operators-5cj42" 
Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.216174 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-catalog-content\") pod \"community-operators-5cj42\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.216174 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-utilities\") pod \"community-operators-5cj42\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.237534 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvsg5\" (UniqueName: \"kubernetes.io/projected/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-kube-api-access-qvsg5\") pod \"community-operators-5cj42\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.298259 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.489526 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:47:01 crc kubenswrapper[4763]: E0930 14:47:01.489893 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:47:01 crc kubenswrapper[4763]: I0930 14:47:01.798537 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5cj42"] Sep 30 14:47:02 crc kubenswrapper[4763]: I0930 14:47:02.235022 4763 generic.go:334] "Generic (PLEG): container finished" podID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" containerID="e88573e649a74cc06b52665d07c10b96d5c4ac76181fef0d70e3096170db4fa5" exitCode=0 Sep 30 14:47:02 crc kubenswrapper[4763]: I0930 14:47:02.235061 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cj42" event={"ID":"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d","Type":"ContainerDied","Data":"e88573e649a74cc06b52665d07c10b96d5c4ac76181fef0d70e3096170db4fa5"} Sep 30 14:47:02 crc kubenswrapper[4763]: I0930 14:47:02.235085 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cj42" event={"ID":"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d","Type":"ContainerStarted","Data":"93fbe83a737c9bb7a46eec8c1ce5c6226a07ab6cfb0c2ef3032d4b14b193fc55"} Sep 30 14:47:02 crc kubenswrapper[4763]: I0930 14:47:02.237232 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:47:04 crc kubenswrapper[4763]: I0930 14:47:04.264577 4763 generic.go:334] "Generic (PLEG): container finished" podID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" containerID="9c8f6da3370c90c0965821237ed8e3f44545a274ce1730c39c766a4ee4c87945" exitCode=0 Sep 30 14:47:04 crc 
kubenswrapper[4763]: I0930 14:47:04.264672 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cj42" event={"ID":"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d","Type":"ContainerDied","Data":"9c8f6da3370c90c0965821237ed8e3f44545a274ce1730c39c766a4ee4c87945"} Sep 30 14:47:05 crc kubenswrapper[4763]: I0930 14:47:05.275182 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cj42" event={"ID":"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d","Type":"ContainerStarted","Data":"139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671"} Sep 30 14:47:05 crc kubenswrapper[4763]: I0930 14:47:05.293581 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5cj42" podStartSLOduration=2.7923238379999997 podStartE2EDuration="5.293562798s" podCreationTimestamp="2025-09-30 14:47:00 +0000 UTC" firstStartedPulling="2025-09-30 14:47:02.23696533 +0000 UTC m=+4294.375525615" lastFinishedPulling="2025-09-30 14:47:04.73820429 +0000 UTC m=+4296.876764575" observedRunningTime="2025-09-30 14:47:05.2900519 +0000 UTC m=+4297.428612195" watchObservedRunningTime="2025-09-30 14:47:05.293562798 +0000 UTC m=+4297.432123093" Sep 30 14:47:11 crc kubenswrapper[4763]: I0930 14:47:11.298910 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:11 crc kubenswrapper[4763]: I0930 14:47:11.299577 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:11 crc kubenswrapper[4763]: I0930 14:47:11.344511 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:11 crc kubenswrapper[4763]: I0930 14:47:11.386029 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:11 crc kubenswrapper[4763]: I0930 14:47:11.582851 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5cj42"] Sep 30 14:47:12 crc kubenswrapper[4763]: I0930 14:47:12.489936 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:47:12 crc kubenswrapper[4763]: E0930 14:47:12.490146 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:47:13 crc kubenswrapper[4763]: I0930 14:47:13.345528 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5cj42" podUID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" containerName="registry-server" containerID="cri-o://139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671" gracePeriod=2 Sep 30 14:47:13 crc kubenswrapper[4763]: I0930 14:47:13.776724 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:13 crc kubenswrapper[4763]: I0930 14:47:13.904971 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvsg5\" (UniqueName: \"kubernetes.io/projected/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-kube-api-access-qvsg5\") pod \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " Sep 30 14:47:13 crc kubenswrapper[4763]: I0930 14:47:13.905443 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-catalog-content\") pod \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " Sep 30 14:47:13 crc kubenswrapper[4763]: I0930 14:47:13.905491 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-utilities\") pod \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\" (UID: \"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d\") " Sep 30 14:47:13 crc kubenswrapper[4763]: I0930 14:47:13.907125 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-utilities" (OuterVolumeSpecName: "utilities") pod "1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" (UID: "1180893b-f3f2-4dc3-8fb7-0531b9e54b0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:47:13 crc kubenswrapper[4763]: I0930 14:47:13.913493 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-kube-api-access-qvsg5" (OuterVolumeSpecName: "kube-api-access-qvsg5") pod "1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" (UID: "1180893b-f3f2-4dc3-8fb7-0531b9e54b0d"). InnerVolumeSpecName "kube-api-access-qvsg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:47:13 crc kubenswrapper[4763]: I0930 14:47:13.980326 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" (UID: "1180893b-f3f2-4dc3-8fb7-0531b9e54b0d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.007535 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvsg5\" (UniqueName: \"kubernetes.io/projected/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-kube-api-access-qvsg5\") on node \"crc\" DevicePath \"\"" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.007588 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.007625 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.355899 4763 generic.go:334] "Generic (PLEG): container finished" podID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" containerID="139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671" exitCode=0 Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.355981 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cj42" event={"ID":"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d","Type":"ContainerDied","Data":"139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671"} Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.356271 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cj42" event={"ID":"1180893b-f3f2-4dc3-8fb7-0531b9e54b0d","Type":"ContainerDied","Data":"93fbe83a737c9bb7a46eec8c1ce5c6226a07ab6cfb0c2ef3032d4b14b193fc55"} Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.356009 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5cj42" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.356401 4763 scope.go:117] "RemoveContainer" containerID="139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.391144 4763 scope.go:117] "RemoveContainer" containerID="9c8f6da3370c90c0965821237ed8e3f44545a274ce1730c39c766a4ee4c87945" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.395088 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5cj42"] Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.402069 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5cj42"] Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.416365 4763 scope.go:117] "RemoveContainer" containerID="e88573e649a74cc06b52665d07c10b96d5c4ac76181fef0d70e3096170db4fa5" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.451104 4763 scope.go:117] "RemoveContainer" containerID="139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671" Sep 30 14:47:14 crc kubenswrapper[4763]: E0930 14:47:14.451717 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671\": container with ID starting with 139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671 not found: ID does not exist" containerID="139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.451765 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671"} err="failed to get container status \"139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671\": rpc error: code = NotFound desc = could not find container \"139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671\": container with ID starting with 139b9ebc536e2ff39729ba59f5d4e4b5cf13d1c9ac3c50aa4a702440d2b37671 not found: ID does not exist" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.451795 4763 scope.go:117] "RemoveContainer" containerID="9c8f6da3370c90c0965821237ed8e3f44545a274ce1730c39c766a4ee4c87945" Sep 30 14:47:14 crc kubenswrapper[4763]: E0930 14:47:14.452147 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c8f6da3370c90c0965821237ed8e3f44545a274ce1730c39c766a4ee4c87945\": container with ID starting with 9c8f6da3370c90c0965821237ed8e3f44545a274ce1730c39c766a4ee4c87945 not found: ID does not exist" containerID="9c8f6da3370c90c0965821237ed8e3f44545a274ce1730c39c766a4ee4c87945" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.452164 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8f6da3370c90c0965821237ed8e3f44545a274ce1730c39c766a4ee4c87945"} err="failed to get container status \"9c8f6da3370c90c0965821237ed8e3f44545a274ce1730c39c766a4ee4c87945\": rpc error: code = NotFound desc = could not find container \"9c8f6da3370c90c0965821237ed8e3f44545a274ce1730c39c766a4ee4c87945\": container with ID starting with 9c8f6da3370c90c0965821237ed8e3f44545a274ce1730c39c766a4ee4c87945 not found: ID does not exist" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.452178 4763 scope.go:117] "RemoveContainer" 
containerID="e88573e649a74cc06b52665d07c10b96d5c4ac76181fef0d70e3096170db4fa5" Sep 30 14:47:14 crc kubenswrapper[4763]: E0930 14:47:14.452416 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e88573e649a74cc06b52665d07c10b96d5c4ac76181fef0d70e3096170db4fa5\": container with ID starting with e88573e649a74cc06b52665d07c10b96d5c4ac76181fef0d70e3096170db4fa5 not found: ID does not exist" containerID="e88573e649a74cc06b52665d07c10b96d5c4ac76181fef0d70e3096170db4fa5" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.452432 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88573e649a74cc06b52665d07c10b96d5c4ac76181fef0d70e3096170db4fa5"} err="failed to get container status \"e88573e649a74cc06b52665d07c10b96d5c4ac76181fef0d70e3096170db4fa5\": rpc error: code = NotFound desc = could not find container \"e88573e649a74cc06b52665d07c10b96d5c4ac76181fef0d70e3096170db4fa5\": container with ID starting with e88573e649a74cc06b52665d07c10b96d5c4ac76181fef0d70e3096170db4fa5 not found: ID does not exist" Sep 30 14:47:14 crc kubenswrapper[4763]: I0930 14:47:14.500667 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" path="/var/lib/kubelet/pods/1180893b-f3f2-4dc3-8fb7-0531b9e54b0d/volumes" Sep 30 14:47:24 crc kubenswrapper[4763]: I0930 14:47:24.489381 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:47:24 crc kubenswrapper[4763]: E0930 14:47:24.490120 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:47:39 crc kubenswrapper[4763]: I0930 14:47:39.490400 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:47:39 crc kubenswrapper[4763]: E0930 14:47:39.491755 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:47:54 crc kubenswrapper[4763]: I0930 14:47:54.489649 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:47:54 crc kubenswrapper[4763]: E0930 14:47:54.490385 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:48:07 crc kubenswrapper[4763]: I0930 14:48:07.489923 4763 scope.go:117] "RemoveContainer" 
containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:48:07 crc kubenswrapper[4763]: E0930 14:48:07.490836 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:48:22 crc kubenswrapper[4763]: I0930 14:48:22.489158 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:48:22 crc kubenswrapper[4763]: E0930 14:48:22.489995 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:48:35 crc kubenswrapper[4763]: I0930 14:48:35.489972 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:48:35 crc kubenswrapper[4763]: E0930 14:48:35.490919 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:48:48 crc kubenswrapper[4763]: I0930 14:48:48.494107 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:48:48 crc kubenswrapper[4763]: E0930 14:48:48.495141 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:49:02 crc kubenswrapper[4763]: I0930 14:49:02.489634 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:49:02 crc kubenswrapper[4763]: E0930 14:49:02.490652 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:49:17 crc kubenswrapper[4763]: I0930 14:49:17.489386 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:49:17 crc kubenswrapper[4763]: E0930 14:49:17.489913 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:49:29 crc kubenswrapper[4763]: I0930 14:49:29.489643 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:49:29 crc kubenswrapper[4763]: E0930 14:49:29.490664 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.111190 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-94rm8"] Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.115666 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-94rm8"] Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.247026 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-hxqh4"] Sep 30 14:49:31 crc kubenswrapper[4763]: E0930 14:49:31.247466 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" containerName="registry-server" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.247494 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" containerName="registry-server" Sep 30 14:49:31 crc kubenswrapper[4763]: E0930 14:49:31.247520 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" containerName="extract-content" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.247534 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" containerName="extract-content" Sep 30 14:49:31 crc kubenswrapper[4763]: E0930 14:49:31.247562 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" containerName="extract-utilities" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.247572 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" containerName="extract-utilities" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.247819 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1180893b-f3f2-4dc3-8fb7-0531b9e54b0d" containerName="registry-server" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.248564 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.253823 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.253874 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.253955 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.254384 4763 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-jrp26" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.261088 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-hxqh4"] Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.388576 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fggqs\" (UniqueName: \"kubernetes.io/projected/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-kube-api-access-fggqs\") pod \"crc-storage-crc-hxqh4\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.388699 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-node-mnt\") pod \"crc-storage-crc-hxqh4\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.388728 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-crc-storage\") pod \"crc-storage-crc-hxqh4\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.489712 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-node-mnt\") pod \"crc-storage-crc-hxqh4\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.490099 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-crc-storage\") pod \"crc-storage-crc-hxqh4\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.490198 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-node-mnt\") pod \"crc-storage-crc-hxqh4\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.490357 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fggqs\" (UniqueName: \"kubernetes.io/projected/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-kube-api-access-fggqs\") pod \"crc-storage-crc-hxqh4\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " 
pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.491198 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-crc-storage\") pod \"crc-storage-crc-hxqh4\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.514863 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fggqs\" (UniqueName: \"kubernetes.io/projected/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-kube-api-access-fggqs\") pod \"crc-storage-crc-hxqh4\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:31 crc kubenswrapper[4763]: I0930 14:49:31.570372 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:32 crc kubenswrapper[4763]: I0930 14:49:32.001107 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-hxqh4"] Sep 30 14:49:32 crc kubenswrapper[4763]: I0930 14:49:32.372019 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-hxqh4" event={"ID":"b53bafc9-45e7-4b00-ba54-9adbf9ee7456","Type":"ContainerStarted","Data":"05b1dfde27d8e1a38bdafd09ae6e9a8d518d4d942d3de40dc93b49488774d964"} Sep 30 14:49:32 crc kubenswrapper[4763]: I0930 14:49:32.509074 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fae1a56-54ac-419f-8c7a-8230786d5188" path="/var/lib/kubelet/pods/4fae1a56-54ac-419f-8c7a-8230786d5188/volumes" Sep 30 14:49:33 crc kubenswrapper[4763]: I0930 14:49:33.383723 4763 generic.go:334] "Generic (PLEG): container finished" podID="b53bafc9-45e7-4b00-ba54-9adbf9ee7456" containerID="917e9ea7f5ee9f6190661c5e05c014f6c2b24ae163c3eda2ec4717ee524783a3" exitCode=0 Sep 30 14:49:33 crc kubenswrapper[4763]: I0930 14:49:33.384058 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-hxqh4" event={"ID":"b53bafc9-45e7-4b00-ba54-9adbf9ee7456","Type":"ContainerDied","Data":"917e9ea7f5ee9f6190661c5e05c014f6c2b24ae163c3eda2ec4717ee524783a3"} Sep 30 14:49:33 crc kubenswrapper[4763]: I0930 14:49:33.897741 4763 scope.go:117] "RemoveContainer" containerID="90d2b0cf3fcdf2dc227081ea734b8b59d4df1002ff947c97d32d3c7ea42c767f" Sep 30 14:49:34 crc kubenswrapper[4763]: I0930 14:49:34.862299 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.042321 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-node-mnt\") pod \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.042474 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "b53bafc9-45e7-4b00-ba54-9adbf9ee7456" (UID: "b53bafc9-45e7-4b00-ba54-9adbf9ee7456"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.042496 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fggqs\" (UniqueName: \"kubernetes.io/projected/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-kube-api-access-fggqs\") pod \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.042576 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-crc-storage\") pod \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\" (UID: \"b53bafc9-45e7-4b00-ba54-9adbf9ee7456\") " Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.043227 4763 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-node-mnt\") on node \"crc\" DevicePath \"\"" Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.046780 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-kube-api-access-fggqs" (OuterVolumeSpecName: "kube-api-access-fggqs") pod "b53bafc9-45e7-4b00-ba54-9adbf9ee7456" (UID: "b53bafc9-45e7-4b00-ba54-9adbf9ee7456"). InnerVolumeSpecName "kube-api-access-fggqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.059976 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "b53bafc9-45e7-4b00-ba54-9adbf9ee7456" (UID: "b53bafc9-45e7-4b00-ba54-9adbf9ee7456"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.144465 4763 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-crc-storage\") on node \"crc\" DevicePath \"\"" Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.144498 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fggqs\" (UniqueName: \"kubernetes.io/projected/b53bafc9-45e7-4b00-ba54-9adbf9ee7456-kube-api-access-fggqs\") on node \"crc\" DevicePath \"\"" Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.404458 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-hxqh4" event={"ID":"b53bafc9-45e7-4b00-ba54-9adbf9ee7456","Type":"ContainerDied","Data":"05b1dfde27d8e1a38bdafd09ae6e9a8d518d4d942d3de40dc93b49488774d964"} Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.404506 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05b1dfde27d8e1a38bdafd09ae6e9a8d518d4d942d3de40dc93b49488774d964" Sep 30 14:49:35 crc kubenswrapper[4763]: I0930 14:49:35.404535 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-hxqh4" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.100654 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-hxqh4"] Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.108168 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-hxqh4"] Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.258297 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tdpd9"] Sep 30 14:49:37 crc kubenswrapper[4763]: E0930 14:49:37.258660 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53bafc9-45e7-4b00-ba54-9adbf9ee7456" containerName="storage" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.258678 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53bafc9-45e7-4b00-ba54-9adbf9ee7456" containerName="storage" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.258871 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b53bafc9-45e7-4b00-ba54-9adbf9ee7456" containerName="storage" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.259444 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.268478 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.268533 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.269151 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.269513 4763 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-jrp26" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.276186 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/feb2e36d-e4be-4222-8d96-9086d44150cc-crc-storage\") pod \"crc-storage-crc-tdpd9\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.276370 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/feb2e36d-e4be-4222-8d96-9086d44150cc-node-mnt\") pod \"crc-storage-crc-tdpd9\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.276496 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mjmx\" (UniqueName: \"kubernetes.io/projected/feb2e36d-e4be-4222-8d96-9086d44150cc-kube-api-access-4mjmx\") pod \"crc-storage-crc-tdpd9\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.279981 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tdpd9"] Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.377108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/feb2e36d-e4be-4222-8d96-9086d44150cc-crc-storage\") pod \"crc-storage-crc-tdpd9\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.377158 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/feb2e36d-e4be-4222-8d96-9086d44150cc-node-mnt\") pod \"crc-storage-crc-tdpd9\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.377193 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mjmx\" (UniqueName: \"kubernetes.io/projected/feb2e36d-e4be-4222-8d96-9086d44150cc-kube-api-access-4mjmx\") pod \"crc-storage-crc-tdpd9\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.377701 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/feb2e36d-e4be-4222-8d96-9086d44150cc-node-mnt\") pod \"crc-storage-crc-tdpd9\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.378121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/feb2e36d-e4be-4222-8d96-9086d44150cc-crc-storage\") pod \"crc-storage-crc-tdpd9\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.398024 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mjmx\" (UniqueName: \"kubernetes.io/projected/feb2e36d-e4be-4222-8d96-9086d44150cc-kube-api-access-4mjmx\") pod \"crc-storage-crc-tdpd9\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.577324 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:37 crc kubenswrapper[4763]: I0930 14:49:37.987385 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tdpd9"] Sep 30 14:49:38 crc kubenswrapper[4763]: I0930 14:49:38.428293 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tdpd9" event={"ID":"feb2e36d-e4be-4222-8d96-9086d44150cc","Type":"ContainerStarted","Data":"a8ddf1eb41478ccb71dec93240e8dd862f4a44bb0515d5db201cdfb3ad41fe35"} Sep 30 14:49:38 crc kubenswrapper[4763]: I0930 14:49:38.501492 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b53bafc9-45e7-4b00-ba54-9adbf9ee7456" path="/var/lib/kubelet/pods/b53bafc9-45e7-4b00-ba54-9adbf9ee7456/volumes" Sep 30 14:49:39 crc kubenswrapper[4763]: I0930 14:49:39.442264 4763 generic.go:334] "Generic (PLEG): container finished" podID="feb2e36d-e4be-4222-8d96-9086d44150cc" containerID="ae375c23f4a1184a70e7116acba6a8d2a8a93e6f55477b8dcb9159961cd5ffce" exitCode=0 Sep 30 14:49:39 crc kubenswrapper[4763]: I0930 14:49:39.442509 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tdpd9" event={"ID":"feb2e36d-e4be-4222-8d96-9086d44150cc","Type":"ContainerDied","Data":"ae375c23f4a1184a70e7116acba6a8d2a8a93e6f55477b8dcb9159961cd5ffce"} Sep 30 14:49:40 crc kubenswrapper[4763]: I0930 14:49:40.786964 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:40 crc kubenswrapper[4763]: I0930 14:49:40.924322 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/feb2e36d-e4be-4222-8d96-9086d44150cc-crc-storage\") pod \"feb2e36d-e4be-4222-8d96-9086d44150cc\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " Sep 30 14:49:40 crc kubenswrapper[4763]: I0930 14:49:40.924509 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mjmx\" (UniqueName: \"kubernetes.io/projected/feb2e36d-e4be-4222-8d96-9086d44150cc-kube-api-access-4mjmx\") pod \"feb2e36d-e4be-4222-8d96-9086d44150cc\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " Sep 30 14:49:40 crc kubenswrapper[4763]: I0930 14:49:40.924574 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/feb2e36d-e4be-4222-8d96-9086d44150cc-node-mnt\") pod \"feb2e36d-e4be-4222-8d96-9086d44150cc\" (UID: \"feb2e36d-e4be-4222-8d96-9086d44150cc\") " Sep 30 14:49:40 crc kubenswrapper[4763]: I0930 14:49:40.924799 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/feb2e36d-e4be-4222-8d96-9086d44150cc-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "feb2e36d-e4be-4222-8d96-9086d44150cc" (UID: "feb2e36d-e4be-4222-8d96-9086d44150cc"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:49:40 crc kubenswrapper[4763]: I0930 14:49:40.924995 4763 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/feb2e36d-e4be-4222-8d96-9086d44150cc-node-mnt\") on node \"crc\" DevicePath \"\"" Sep 30 14:49:40 crc kubenswrapper[4763]: I0930 14:49:40.931055 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb2e36d-e4be-4222-8d96-9086d44150cc-kube-api-access-4mjmx" (OuterVolumeSpecName: "kube-api-access-4mjmx") pod "feb2e36d-e4be-4222-8d96-9086d44150cc" (UID: "feb2e36d-e4be-4222-8d96-9086d44150cc"). InnerVolumeSpecName "kube-api-access-4mjmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:49:40 crc kubenswrapper[4763]: I0930 14:49:40.942764 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feb2e36d-e4be-4222-8d96-9086d44150cc-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "feb2e36d-e4be-4222-8d96-9086d44150cc" (UID: "feb2e36d-e4be-4222-8d96-9086d44150cc"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:49:41 crc kubenswrapper[4763]: I0930 14:49:41.027183 4763 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/feb2e36d-e4be-4222-8d96-9086d44150cc-crc-storage\") on node \"crc\" DevicePath \"\"" Sep 30 14:49:41 crc kubenswrapper[4763]: I0930 14:49:41.027276 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mjmx\" (UniqueName: \"kubernetes.io/projected/feb2e36d-e4be-4222-8d96-9086d44150cc-kube-api-access-4mjmx\") on node \"crc\" DevicePath \"\"" Sep 30 14:49:41 crc kubenswrapper[4763]: I0930 14:49:41.461313 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tdpd9" event={"ID":"feb2e36d-e4be-4222-8d96-9086d44150cc","Type":"ContainerDied","Data":"a8ddf1eb41478ccb71dec93240e8dd862f4a44bb0515d5db201cdfb3ad41fe35"} Sep 30 14:49:41 crc kubenswrapper[4763]: I0930 14:49:41.461359 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8ddf1eb41478ccb71dec93240e8dd862f4a44bb0515d5db201cdfb3ad41fe35" Sep 30 14:49:41 crc kubenswrapper[4763]: I0930 14:49:41.461403 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tdpd9" Sep 30 14:49:43 crc kubenswrapper[4763]: I0930 14:49:43.489230 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:49:43 crc kubenswrapper[4763]: E0930 14:49:43.489733 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:49:58 crc kubenswrapper[4763]: I0930 14:49:58.493303 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:49:58 crc kubenswrapper[4763]: E0930 14:49:58.493964 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:50:09 crc kubenswrapper[4763]: I0930 14:50:09.489623 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:50:09 crc kubenswrapper[4763]: E0930 14:50:09.490312 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:50:17 crc kubenswrapper[4763]: I0930 14:50:17.914749 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-56r4p"] Sep 30 14:50:17 crc kubenswrapper[4763]: E0930 14:50:17.919052 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb2e36d-e4be-4222-8d96-9086d44150cc" containerName="storage" Sep 30 14:50:17 crc kubenswrapper[4763]: I0930 14:50:17.919091 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb2e36d-e4be-4222-8d96-9086d44150cc" containerName="storage" Sep 30 14:50:17 crc kubenswrapper[4763]: I0930 14:50:17.919317 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb2e36d-e4be-4222-8d96-9086d44150cc" containerName="storage" Sep 30 14:50:17 crc kubenswrapper[4763]: I0930 14:50:17.920576 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:17 crc kubenswrapper[4763]: I0930 14:50:17.928208 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56r4p"] Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.071492 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gm87\" (UniqueName: \"kubernetes.io/projected/3bd23bf8-835b-43be-8a06-19be86eb2609-kube-api-access-4gm87\") pod \"redhat-operators-56r4p\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.071581 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-utilities\") pod \"redhat-operators-56r4p\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.071755 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-catalog-content\") pod \"redhat-operators-56r4p\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.173399 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gm87\" (UniqueName: \"kubernetes.io/projected/3bd23bf8-835b-43be-8a06-19be86eb2609-kube-api-access-4gm87\") pod \"redhat-operators-56r4p\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.173462 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-utilities\") pod \"redhat-operators-56r4p\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.173494 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-catalog-content\") pod \"redhat-operators-56r4p\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.174010 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-catalog-content\") pod \"redhat-operators-56r4p\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.174232 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-utilities\") pod \"redhat-operators-56r4p\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.191716 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4gm87\" (UniqueName: \"kubernetes.io/projected/3bd23bf8-835b-43be-8a06-19be86eb2609-kube-api-access-4gm87\") pod \"redhat-operators-56r4p\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.242743 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.526599 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56r4p"] Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.734760 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56r4p" event={"ID":"3bd23bf8-835b-43be-8a06-19be86eb2609","Type":"ContainerStarted","Data":"6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55"} Sep 30 14:50:18 crc kubenswrapper[4763]: I0930 14:50:18.734810 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56r4p" event={"ID":"3bd23bf8-835b-43be-8a06-19be86eb2609","Type":"ContainerStarted","Data":"515342f82d28f17a665f5d11e11b96bce9e42ae8743cbaaa5348a6024f4b47e0"} Sep 30 14:50:19 crc kubenswrapper[4763]: I0930 14:50:19.745589 4763 generic.go:334] "Generic (PLEG): container finished" podID="3bd23bf8-835b-43be-8a06-19be86eb2609" containerID="6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55" exitCode=0 Sep 30 14:50:19 crc kubenswrapper[4763]: I0930 14:50:19.745767 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56r4p" event={"ID":"3bd23bf8-835b-43be-8a06-19be86eb2609","Type":"ContainerDied","Data":"6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55"} Sep 30 14:50:21 crc kubenswrapper[4763]: I0930 14:50:21.765860 4763 generic.go:334] "Generic (PLEG): container finished" podID="3bd23bf8-835b-43be-8a06-19be86eb2609" containerID="971b89c65dc26033611ceef0f95b0e14953cf15296b6d2d9c99690a45c04da21" exitCode=0 Sep 30 14:50:21 crc kubenswrapper[4763]: I0930 14:50:21.765966 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56r4p" event={"ID":"3bd23bf8-835b-43be-8a06-19be86eb2609","Type":"ContainerDied","Data":"971b89c65dc26033611ceef0f95b0e14953cf15296b6d2d9c99690a45c04da21"} Sep 30 14:50:22 crc kubenswrapper[4763]: I0930 14:50:22.489842 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:50:22 crc kubenswrapper[4763]: E0930 14:50:22.490077 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:50:23 crc kubenswrapper[4763]: I0930 14:50:23.781745 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56r4p" event={"ID":"3bd23bf8-835b-43be-8a06-19be86eb2609","Type":"ContainerStarted","Data":"7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e"} Sep 30 14:50:23 crc kubenswrapper[4763]: I0930 14:50:23.802141 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-56r4p" podStartSLOduration=3.7213643 podStartE2EDuration="6.80212266s" podCreationTimestamp="2025-09-30 14:50:17 +0000 UTC" firstStartedPulling="2025-09-30 14:50:19.748031069 +0000 UTC m=+4491.886591354" lastFinishedPulling="2025-09-30 14:50:22.828789429 +0000 UTC m=+4494.967349714" observedRunningTime="2025-09-30 14:50:23.799192777 +0000 UTC m=+4495.937753092" watchObservedRunningTime="2025-09-30 14:50:23.80212266 +0000 UTC m=+4495.940682965" Sep 30 14:50:28 crc kubenswrapper[4763]: I0930 14:50:28.243319 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:28 crc kubenswrapper[4763]: I0930 14:50:28.243569 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:28 crc kubenswrapper[4763]: I0930 14:50:28.296073 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:28 crc kubenswrapper[4763]: I0930 14:50:28.859758 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:28 crc kubenswrapper[4763]: I0930 14:50:28.901183 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56r4p"] Sep 30 14:50:30 crc kubenswrapper[4763]: I0930 14:50:30.831906 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-56r4p" podUID="3bd23bf8-835b-43be-8a06-19be86eb2609" containerName="registry-server" containerID="cri-o://7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e" gracePeriod=2 Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.210838 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.368988 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-utilities\") pod \"3bd23bf8-835b-43be-8a06-19be86eb2609\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.369141 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gm87\" (UniqueName: \"kubernetes.io/projected/3bd23bf8-835b-43be-8a06-19be86eb2609-kube-api-access-4gm87\") pod \"3bd23bf8-835b-43be-8a06-19be86eb2609\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.370005 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-catalog-content\") pod \"3bd23bf8-835b-43be-8a06-19be86eb2609\" (UID: \"3bd23bf8-835b-43be-8a06-19be86eb2609\") " Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.370039 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-utilities" (OuterVolumeSpecName: "utilities") pod "3bd23bf8-835b-43be-8a06-19be86eb2609" (UID: "3bd23bf8-835b-43be-8a06-19be86eb2609"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.370265 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.375869 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd23bf8-835b-43be-8a06-19be86eb2609-kube-api-access-4gm87" (OuterVolumeSpecName: "kube-api-access-4gm87") pod "3bd23bf8-835b-43be-8a06-19be86eb2609" (UID: "3bd23bf8-835b-43be-8a06-19be86eb2609"). InnerVolumeSpecName "kube-api-access-4gm87". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.462992 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bd23bf8-835b-43be-8a06-19be86eb2609" (UID: "3bd23bf8-835b-43be-8a06-19be86eb2609"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.470845 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gm87\" (UniqueName: \"kubernetes.io/projected/3bd23bf8-835b-43be-8a06-19be86eb2609-kube-api-access-4gm87\") on node \"crc\" DevicePath \"\"" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.470881 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd23bf8-835b-43be-8a06-19be86eb2609-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.840955 4763 generic.go:334] "Generic (PLEG): container finished" podID="3bd23bf8-835b-43be-8a06-19be86eb2609" containerID="7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e" exitCode=0 Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.841010 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56r4p" event={"ID":"3bd23bf8-835b-43be-8a06-19be86eb2609","Type":"ContainerDied","Data":"7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e"} Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.841050 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56r4p" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.841074 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56r4p" event={"ID":"3bd23bf8-835b-43be-8a06-19be86eb2609","Type":"ContainerDied","Data":"515342f82d28f17a665f5d11e11b96bce9e42ae8743cbaaa5348a6024f4b47e0"} Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.841098 4763 scope.go:117] "RemoveContainer" containerID="7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.871286 4763 scope.go:117] "RemoveContainer" containerID="971b89c65dc26033611ceef0f95b0e14953cf15296b6d2d9c99690a45c04da21" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.883748 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56r4p"] Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.890195 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-56r4p"] Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.906147 4763 scope.go:117] "RemoveContainer" containerID="6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.929440 4763 scope.go:117] "RemoveContainer" containerID="7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e" Sep 30 14:50:31 crc kubenswrapper[4763]: E0930 14:50:31.930029 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e\": container with ID starting with 7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e not found: ID does not exist" containerID="7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.930109 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e"} err="failed to get container status \"7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e\": rpc error: code = NotFound desc = could not find container \"7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e\": container with ID starting with 7e400a6bccf6160dc2d8a74f72dbcac24713f251b0fddb7dbdbfffdf4461348e not found: ID does not exist" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.930143 4763 scope.go:117] "RemoveContainer" containerID="971b89c65dc26033611ceef0f95b0e14953cf15296b6d2d9c99690a45c04da21" Sep 30 14:50:31 crc kubenswrapper[4763]: E0930 14:50:31.930557 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971b89c65dc26033611ceef0f95b0e14953cf15296b6d2d9c99690a45c04da21\": container with ID starting with 971b89c65dc26033611ceef0f95b0e14953cf15296b6d2d9c99690a45c04da21 not found: ID does not exist" containerID="971b89c65dc26033611ceef0f95b0e14953cf15296b6d2d9c99690a45c04da21" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.930679 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971b89c65dc26033611ceef0f95b0e14953cf15296b6d2d9c99690a45c04da21"} err="failed to get container status \"971b89c65dc26033611ceef0f95b0e14953cf15296b6d2d9c99690a45c04da21\": rpc error: code = NotFound desc = could not find container 
\"971b89c65dc26033611ceef0f95b0e14953cf15296b6d2d9c99690a45c04da21\": container with ID starting with 971b89c65dc26033611ceef0f95b0e14953cf15296b6d2d9c99690a45c04da21 not found: ID does not exist" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.930719 4763 scope.go:117] "RemoveContainer" containerID="6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55" Sep 30 14:50:31 crc kubenswrapper[4763]: E0930 14:50:31.931056 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55\": container with ID starting with 6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55 not found: ID does not exist" containerID="6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55" Sep 30 14:50:31 crc kubenswrapper[4763]: I0930 14:50:31.931086 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55"} err="failed to get container status \"6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55\": rpc error: code = NotFound desc = could not find container \"6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55\": container with ID starting with 6d6d179fbace6f0be27917a49e64228c58427f382aaaa1aafc5cbf713ae73e55 not found: ID does not exist" Sep 30 14:50:32 crc kubenswrapper[4763]: I0930 14:50:32.514614 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd23bf8-835b-43be-8a06-19be86eb2609" path="/var/lib/kubelet/pods/3bd23bf8-835b-43be-8a06-19be86eb2609/volumes" Sep 30 14:50:36 crc kubenswrapper[4763]: I0930 14:50:36.490013 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:50:36 crc kubenswrapper[4763]: E0930 14:50:36.490592 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:50:47 crc kubenswrapper[4763]: I0930 14:50:47.489578 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:50:47 crc kubenswrapper[4763]: E0930 14:50:47.490458 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:50:59 crc kubenswrapper[4763]: I0930 14:50:59.489241 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:50:59 crc kubenswrapper[4763]: E0930 14:50:59.490165 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:51:10 crc kubenswrapper[4763]: I0930 14:51:10.489009 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:51:11 crc kubenswrapper[4763]: I0930 14:51:11.133965 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"4da597f3a8b25df538033d6e0e1f219426e990b064711c97664abf3092e45110"} Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.466630 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-dc7q7"] Sep 30 14:52:47 crc kubenswrapper[4763]: E0930 14:52:47.467504 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd23bf8-835b-43be-8a06-19be86eb2609" containerName="registry-server" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.467519 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd23bf8-835b-43be-8a06-19be86eb2609" containerName="registry-server" Sep 30 14:52:47 crc kubenswrapper[4763]: E0930 14:52:47.467536 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd23bf8-835b-43be-8a06-19be86eb2609" containerName="extract-content" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.467543 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd23bf8-835b-43be-8a06-19be86eb2609" containerName="extract-content" Sep 30 14:52:47 crc kubenswrapper[4763]: E0930 14:52:47.467567 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd23bf8-835b-43be-8a06-19be86eb2609" containerName="extract-utilities" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.467575 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd23bf8-835b-43be-8a06-19be86eb2609" containerName="extract-utilities" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.468263 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd23bf8-835b-43be-8a06-19be86eb2609" containerName="registry-server" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.469195 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.473523 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.473853 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.473974 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.474090 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lx62t" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.474238 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.485572 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-dc7q7"] Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.578053 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-config\") pod \"dnsmasq-dns-bc7bd85-dc7q7\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.578141 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-dns-svc\") pod \"dnsmasq-dns-bc7bd85-dc7q7\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.578206 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krgpv\" (UniqueName: \"kubernetes.io/projected/fba032d8-7e42-407e-91a5-edcbe0fb3491-kube-api-access-krgpv\") pod \"dnsmasq-dns-bc7bd85-dc7q7\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.679439 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-config\") pod \"dnsmasq-dns-bc7bd85-dc7q7\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.679552 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-dns-svc\") pod \"dnsmasq-dns-bc7bd85-dc7q7\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.679637 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krgpv\" (UniqueName: \"kubernetes.io/projected/fba032d8-7e42-407e-91a5-edcbe0fb3491-kube-api-access-krgpv\") pod \"dnsmasq-dns-bc7bd85-dc7q7\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.680492 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-config\") pod \"dnsmasq-dns-bc7bd85-dc7q7\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.681078 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-dns-svc\") pod \"dnsmasq-dns-bc7bd85-dc7q7\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.705579 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krgpv\" (UniqueName: \"kubernetes.io/projected/fba032d8-7e42-407e-91a5-edcbe0fb3491-kube-api-access-krgpv\") pod \"dnsmasq-dns-bc7bd85-dc7q7\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.759317 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-shfzc"] Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.760535 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.772098 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-shfzc"] Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.780313 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-config\") pod \"dnsmasq-dns-5f455d6d69-shfzc\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.780360 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-dns-svc\") pod \"dnsmasq-dns-5f455d6d69-shfzc\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.780568 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jjfz\" (UniqueName: \"kubernetes.io/projected/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-kube-api-access-2jjfz\") pod \"dnsmasq-dns-5f455d6d69-shfzc\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.788383 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.884249 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-config\") pod \"dnsmasq-dns-5f455d6d69-shfzc\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.884589 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-dns-svc\") pod \"dnsmasq-dns-5f455d6d69-shfzc\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.884734 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jjfz\" (UniqueName: \"kubernetes.io/projected/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-kube-api-access-2jjfz\") pod \"dnsmasq-dns-5f455d6d69-shfzc\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.885414 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-config\") pod \"dnsmasq-dns-5f455d6d69-shfzc\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.885453 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-dns-svc\") pod \"dnsmasq-dns-5f455d6d69-shfzc\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:47 crc kubenswrapper[4763]: I0930 14:52:47.958503 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jjfz\" (UniqueName: \"kubernetes.io/projected/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-kube-api-access-2jjfz\") pod \"dnsmasq-dns-5f455d6d69-shfzc\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.078483 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.303989 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-dc7q7"] Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.567024 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-shfzc"] Sep 30 14:52:48 crc kubenswrapper[4763]: W0930 14:52:48.571643 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5427eafc_b753_4e2f_92bc_b0f4ccbd357c.slice/crio-8d966481e65e92d8e4223b20d842cbdc3dd00f46d2a5db4ad8d52a650d2e6540 WatchSource:0}: Error finding container 8d966481e65e92d8e4223b20d842cbdc3dd00f46d2a5db4ad8d52a650d2e6540: Status 404 returned error can't find the container with id 8d966481e65e92d8e4223b20d842cbdc3dd00f46d2a5db4ad8d52a650d2e6540 Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.630048 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.631647 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.634831 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.634834 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mgrb6" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.634920 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.634968 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.635688 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.650344 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.698688 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.698987 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4972ddb3-00a4-458d-864f-bf101ef508c7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.699109 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.699246 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5phw9\" (UniqueName: \"kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-kube-api-access-5phw9\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.699371 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.700364 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4972ddb3-00a4-458d-864f-bf101ef508c7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.700462 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.700548 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.700700 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-09e64457-1467-4834-b64c-01f23c1cc33a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.736431 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d982r"] Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.738291 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.744412 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d982r"] Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802220 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-catalog-content\") pod \"redhat-marketplace-d982r\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802264 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-utilities\") pod \"redhat-marketplace-d982r\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802346 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802375 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4972ddb3-00a4-458d-864f-bf101ef508c7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802394 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802421 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7sts\" (UniqueName: \"kubernetes.io/projected/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-kube-api-access-h7sts\") pod \"redhat-marketplace-d982r\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802443 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802459 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5phw9\" (UniqueName: \"kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-kube-api-access-5phw9\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802477 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/4972ddb3-00a4-458d-864f-bf101ef508c7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802492 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802510 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.802536 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-09e64457-1467-4834-b64c-01f23c1cc33a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.803189 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.803458 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.804321 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.805423 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.806539 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4972ddb3-00a4-458d-864f-bf101ef508c7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.806582 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4972ddb3-00a4-458d-864f-bf101ef508c7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 
14:52:48.807857 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.807883 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-09e64457-1467-4834-b64c-01f23c1cc33a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9cce1aac6127f0ea8a55e8cb08c10c0074ed4c07f0c00b05440c84be8d071db5/globalmount\"" pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.810328 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.830108 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5phw9\" (UniqueName: \"kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-kube-api-access-5phw9\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.844242 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-09e64457-1467-4834-b64c-01f23c1cc33a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a\") pod \"rabbitmq-server-0\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.903352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-catalog-content\") pod \"redhat-marketplace-d982r\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.903404 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-utilities\") pod \"redhat-marketplace-d982r\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.903510 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7sts\" (UniqueName: \"kubernetes.io/projected/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-kube-api-access-h7sts\") pod \"redhat-marketplace-d982r\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.904447 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-catalog-content\") pod \"redhat-marketplace-d982r\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.904786 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-utilities\") pod \"redhat-marketplace-d982r\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.905403 4763 generic.go:334] "Generic (PLEG): container finished" podID="fba032d8-7e42-407e-91a5-edcbe0fb3491" containerID="dd198006af934ddb40c8acc914254457d06cf6ae73b54de3bd6f946932a87aaf" exitCode=0 Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.905478 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" event={"ID":"fba032d8-7e42-407e-91a5-edcbe0fb3491","Type":"ContainerDied","Data":"dd198006af934ddb40c8acc914254457d06cf6ae73b54de3bd6f946932a87aaf"} Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.905508 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" event={"ID":"fba032d8-7e42-407e-91a5-edcbe0fb3491","Type":"ContainerStarted","Data":"96eda918f2916c3d468248176660dcf03682d4eab9b1f50e997410518472fe89"} Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.908278 4763 generic.go:334] "Generic (PLEG): container finished" podID="5427eafc-b753-4e2f-92bc-b0f4ccbd357c" containerID="bc464c583bead03d4da0cbd5c433c1345a7342a20924e79d526fb34e54dd5ee8" exitCode=0 Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.908308 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" event={"ID":"5427eafc-b753-4e2f-92bc-b0f4ccbd357c","Type":"ContainerDied","Data":"bc464c583bead03d4da0cbd5c433c1345a7342a20924e79d526fb34e54dd5ee8"} Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.908323 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" event={"ID":"5427eafc-b753-4e2f-92bc-b0f4ccbd357c","Type":"ContainerStarted","Data":"8d966481e65e92d8e4223b20d842cbdc3dd00f46d2a5db4ad8d52a650d2e6540"} Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.933536 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7sts\" (UniqueName: \"kubernetes.io/projected/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-kube-api-access-h7sts\") pod \"redhat-marketplace-d982r\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.964980 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.971303 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.985967 4763 util.go:30] "No sandbox for pod can be found. 
Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.988178 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.988228 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.990846 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-bfg25"
Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.991788 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Sep 30 14:52:48 crc kubenswrapper[4763]: I0930 14:52:48.992058 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.002261 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.018506 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.024176 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.028089 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.028879 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-5kxbk"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.034397 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.073622 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d982r"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.106904 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzrrn\" (UniqueName: \"kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-kube-api-access-hzrrn\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.106977 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.107000 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/182ff55d-2f19-4e59-a425-583a949dad4c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.107039 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.107066 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.107088 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.107177 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/182ff55d-2f19-4e59-a425-583a949dad4c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.107234 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.107296 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
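
Every volume in the reconciler entries above is keyed by a UniqueName that encodes the owning plugin: in-tree plugins use kubernetes.io/<kind>/<podUID>-<volumeName> (projected, empty-dir, downward-api, configmap and secret all appear here), while CSI volumes use <plugin>^<volumeHandle>, e.g. kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-.... A minimal Python sketch, not kubelet code, with the layout inferred from the lines above:

    # Toy parser: split a kubelet UniqueName into (plugin, identifier).
    # CSI names use plugin^volumeHandle; in-tree names use
    # kubernetes.io/<kind>/<podUID>-<volumeName>.
    def split_unique_name(unique_name: str) -> tuple[str, str]:
        if "^" in unique_name:  # CSI form
            plugin, handle = unique_name.split("^", 1)
            return plugin, handle
        prefix, _, rest = unique_name.partition("/")   # "kubernetes.io"
        kind, _, ident = rest.partition("/")           # "projected", "secret", ...
        return f"{prefix}/{kind}", ident

    print(split_unique_name(
        "kubernetes.io/csi/kubevirt.io.hostpath-provisioner"
        "^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671"))

Fed the CSI UniqueName above, it returns ('kubernetes.io/csi/kubevirt.io.hostpath-provisioner', 'pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671').
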
\"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:52:49 crc kubenswrapper[4763]: E0930 14:52:49.201321 4763 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Sep 30 14:52:49 crc kubenswrapper[4763]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/fba032d8-7e42-407e-91a5-edcbe0fb3491/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 30 14:52:49 crc kubenswrapper[4763]: > podSandboxID="96eda918f2916c3d468248176660dcf03682d4eab9b1f50e997410518472fe89" Sep 30 14:52:49 crc kubenswrapper[4763]: E0930 14:52:49.201499 4763 kuberuntime_manager.go:1274] "Unhandled Error" err=< Sep 30 14:52:49 crc kubenswrapper[4763]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-krgpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-bc7bd85-dc7q7_openstack(fba032d8-7e42-407e-91a5-edcbe0fb3491): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/fba032d8-7e42-407e-91a5-edcbe0fb3491/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 30 14:52:49 crc kubenswrapper[4763]: > logger="UnhandledError" Sep 30 14:52:49 crc kubenswrapper[4763]: E0930 14:52:49.204993 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/fba032d8-7e42-407e-91a5-edcbe0fb3491/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" podUID="fba032d8-7e42-407e-91a5-edcbe0fb3491" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208172 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/182ff55d-2f19-4e59-a425-583a949dad4c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208217 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208240 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208277 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzrrn\" (UniqueName: \"kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-kube-api-access-hzrrn\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66aa844f-2e00-40de-9084-1f68e6742ab1-config-data\") pod \"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208316 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmrr9\" (UniqueName: \"kubernetes.io/projected/66aa844f-2e00-40de-9084-1f68e6742ab1-kube-api-access-zmrr9\") pod \"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208335 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:52:49 crc 
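
The failure above is a transient subPath ordering problem, not a configuration error: kubelet had already prepared the bind-mount source under .../volume-subpaths/dns-svc/dnsmasq-dns/1, but when the runtime tried to attach it at the relative container path etc/dnsmasq.d/hosts/dns-svc the target was not there yet, so CreateContainer failed, the full &Container{...} spec was dumped through the UnhandledError logger, and the pod worker requeued the sync. The retry succeeds: container 93cf8520... starts at 14:52:50.946324 below and the pod goes ready at 14:52:57.790554. A sketch for pulling such failure reasons out of a kubelet journal piped in on stdin (the script name is hypothetical; the regex follows the escaped quoting visible in the line above):

    import re
    import sys

    # Sketch: extract (pod, reason) from "Error syncing pod, skipping" lines,
    # e.g.: journalctl -u kubelet | python3 sync_errors.py
    PAT = re.compile(
        r'"Error syncing pod, skipping" err="((?:[^"\\]|\\.)*)" pod="([^"]+)"')

    for line in sys.stdin:
        m = PAT.search(line)
        if m:
            print(f"{m.group(2)}: {m.group(1)}")
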
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208172 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/182ff55d-2f19-4e59-a425-583a949dad4c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208217 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208240 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208277 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzrrn\" (UniqueName: \"kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-kube-api-access-hzrrn\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66aa844f-2e00-40de-9084-1f68e6742ab1-config-data\") pod \"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208316 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmrr9\" (UniqueName: \"kubernetes.io/projected/66aa844f-2e00-40de-9084-1f68e6742ab1-kube-api-access-zmrr9\") pod \"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208335 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208350 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/182ff55d-2f19-4e59-a425-583a949dad4c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208374 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208393 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.208432 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66aa844f-2e00-40de-9084-1f68e6742ab1-kolla-config\") pod \"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.210185 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.210519 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.211375 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.213822 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.214589 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.216282 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/182ff55d-2f19-4e59-a425-583a949dad4c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.217982 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.224114 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.224184 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/182ff55d-2f19-4e59-a425-583a949dad4c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.224300 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-f26gl"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.224617 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.224637 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
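
The csi_attacher.go:380 message repeated through this window is informational: kubevirt.io.hostpath-provisioner does not advertise the CSI STAGE_UNSTAGE_VOLUME node capability, so kubelet skips the staging call and immediately reports MountVolume.MountDevice as succeeded. The per-volume staging directory is still allocated under /var/lib/kubelet/plugins/kubernetes.io/csi/<driver>/<64-hex-digest>/globalmount, as the "device mount path" values in these entries show. A small sketch, assuming that path layout and root access on the node, to enumerate them:

    import os

    # Sketch: list the CSI staging directories for the hostpath provisioner,
    # matching the "device mount path" values printed in the log.
    BASE = ("/var/lib/kubelet/plugins/kubernetes.io/csi/"
            "kubevirt.io.hostpath-provisioner")

    for digest in sorted(os.listdir(BASE)):
        contents = os.listdir(os.path.join(BASE, digest))
        print(digest, contents)  # expect ['globalmount'] per staged volume
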
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.224659 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/adddfef9fb30291f48b6bccb14fbabcdc963ca594408526e8a49d5a40750beac/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.228367 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.228526 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.228732 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.234136 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.237912 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.258933 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzrrn\" (UniqueName: \"kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-kube-api-access-hzrrn\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.306900 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\") pod \"rabbitmq-cell1-server-0\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.309415 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/63a8ad25-874a-4688-a6cb-abfac16910a3-config-data-default\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.309471 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a8ad25-874a-4688-a6cb-abfac16910a3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.309509 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63a8ad25-874a-4688-a6cb-abfac16910a3-kolla-config\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.309723 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nbd8\" (UniqueName: \"kubernetes.io/projected/63a8ad25-874a-4688-a6cb-abfac16910a3-kube-api-access-6nbd8\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.309776 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a8db5892-6ee6-4fc4-b3c7-c0d92edec416\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8db5892-6ee6-4fc4-b3c7-c0d92edec416\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.309863 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63a8ad25-874a-4688-a6cb-abfac16910a3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.309985 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/63a8ad25-874a-4688-a6cb-abfac16910a3-secrets\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.310025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66aa844f-2e00-40de-9084-1f68e6742ab1-config-data\") pod \"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.310057 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmrr9\" (UniqueName: \"kubernetes.io/projected/66aa844f-2e00-40de-9084-1f68e6742ab1-kube-api-access-zmrr9\") pod \"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.310094 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/63a8ad25-874a-4688-a6cb-abfac16910a3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.310117 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a8ad25-874a-4688-a6cb-abfac16910a3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.310195 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66aa844f-2e00-40de-9084-1f68e6742ab1-kolla-config\") pod \"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.311139 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66aa844f-2e00-40de-9084-1f68e6742ab1-kolla-config\") pod \"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0"
\"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.313469 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66aa844f-2e00-40de-9084-1f68e6742ab1-config-data\") pod \"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.338394 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmrr9\" (UniqueName: \"kubernetes.io/projected/66aa844f-2e00-40de-9084-1f68e6742ab1-kube-api-access-zmrr9\") pod \"memcached-0\" (UID: \"66aa844f-2e00-40de-9084-1f68e6742ab1\") " pod="openstack/memcached-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.341188 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:52:49 crc kubenswrapper[4763]: W0930 14:52:49.346418 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4972ddb3_00a4_458d_864f_bf101ef508c7.slice/crio-f3fd30e7384a5f9b6992b3cf566b6cc686640a65f727da3bb1fcb3d765672dfc WatchSource:0}: Error finding container f3fd30e7384a5f9b6992b3cf566b6cc686640a65f727da3bb1fcb3d765672dfc: Status 404 returned error can't find the container with id f3fd30e7384a5f9b6992b3cf566b6cc686640a65f727da3bb1fcb3d765672dfc Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.385244 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.410185 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d982r"] Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.411007 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63a8ad25-874a-4688-a6cb-abfac16910a3-kolla-config\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.411064 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nbd8\" (UniqueName: \"kubernetes.io/projected/63a8ad25-874a-4688-a6cb-abfac16910a3-kube-api-access-6nbd8\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.411088 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a8db5892-6ee6-4fc4-b3c7-c0d92edec416\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8db5892-6ee6-4fc4-b3c7-c0d92edec416\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.411125 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63a8ad25-874a-4688-a6cb-abfac16910a3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.411186 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/63a8ad25-874a-4688-a6cb-abfac16910a3-secrets\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.411232 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/63a8ad25-874a-4688-a6cb-abfac16910a3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.411256 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a8ad25-874a-4688-a6cb-abfac16910a3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.411314 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/63a8ad25-874a-4688-a6cb-abfac16910a3-config-data-default\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.411343 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a8ad25-874a-4688-a6cb-abfac16910a3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.411377 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.413053 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63a8ad25-874a-4688-a6cb-abfac16910a3-kolla-config\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.413200 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63a8ad25-874a-4688-a6cb-abfac16910a3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.413698 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/63a8ad25-874a-4688-a6cb-abfac16910a3-config-data-default\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0" Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.414439 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.414566 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a8db5892-6ee6-4fc4-b3c7-c0d92edec416\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8db5892-6ee6-4fc4-b3c7-c0d92edec416\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d9f75969eacc643647f3a001729d19386f34034ba1ac42e460b4fd2739555d9f/globalmount\"" pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.414753 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a8ad25-874a-4688-a6cb-abfac16910a3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.414522 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/63a8ad25-874a-4688-a6cb-abfac16910a3-secrets\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.414826 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a8ad25-874a-4688-a6cb-abfac16910a3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.416072 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/63a8ad25-874a-4688-a6cb-abfac16910a3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: W0930 14:52:49.436042 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb1dac0e_c2b8_4ab0_a28d_7a8bbb5f6819.slice/crio-2518a9be06ba138923aa2698f67dba24cf8512f5e71d50f25a6b12fc09888b3d WatchSource:0}: Error finding container 2518a9be06ba138923aa2698f67dba24cf8512f5e71d50f25a6b12fc09888b3d: Status 404 returned error can't find the container with id 2518a9be06ba138923aa2698f67dba24cf8512f5e71d50f25a6b12fc09888b3d
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.437158 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nbd8\" (UniqueName: \"kubernetes.io/projected/63a8ad25-874a-4688-a6cb-abfac16910a3-kube-api-access-6nbd8\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.451417 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a8db5892-6ee6-4fc4-b3c7-c0d92edec416\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8db5892-6ee6-4fc4-b3c7-c0d92edec416\") pod \"openstack-galera-0\" (UID: \"63a8ad25-874a-4688-a6cb-abfac16910a3\") " pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.541506 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.662703 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.916717 4763 generic.go:334] "Generic (PLEG): container finished" podID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" containerID="daf7a8644ec47a3eadfa14cee06f9ed842f9cc8f02407fa4f49f5ae3463d1c35" exitCode=0
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.916774 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d982r" event={"ID":"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819","Type":"ContainerDied","Data":"daf7a8644ec47a3eadfa14cee06f9ed842f9cc8f02407fa4f49f5ae3463d1c35"}
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.917016 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d982r" event={"ID":"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819","Type":"ContainerStarted","Data":"2518a9be06ba138923aa2698f67dba24cf8512f5e71d50f25a6b12fc09888b3d"}
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.924881 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.926490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" event={"ID":"5427eafc-b753-4e2f-92bc-b0f4ccbd357c","Type":"ContainerStarted","Data":"9179905f9defb87b3a1a00342be6532d21f5372388d383b53351c94998a1ab0f"}
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.926688 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc"
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.928658 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"182ff55d-2f19-4e59-a425-583a949dad4c","Type":"ContainerStarted","Data":"684186d1d67cdc1fae90f891505bc525bddbe08704a278d92f5ab3667b26ee04"}
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.930845 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4972ddb3-00a4-458d-864f-bf101ef508c7","Type":"ContainerStarted","Data":"6bb3b35d0146b678be3d51ab5961ae06d78abeceaf6c7516fb30516d6b4ab030"}
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.930905 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4972ddb3-00a4-458d-864f-bf101ef508c7","Type":"ContainerStarted","Data":"f3fd30e7384a5f9b6992b3cf566b6cc686640a65f727da3bb1fcb3d765672dfc"}
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.948480 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Sep 30 14:52:49 crc kubenswrapper[4763]: I0930 14:52:49.953091 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" podStartSLOduration=2.953071222 podStartE2EDuration="2.953071222s" podCreationTimestamp="2025-09-30 14:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:52:49.945849732 +0000 UTC m=+4642.084410037" watchObservedRunningTime="2025-09-30 14:52:49.953071222 +0000 UTC m=+4642.091631507"
Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.027238 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
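
The pod_startup_latency_tracker entry above records the startup SLO metric: podStartSLOduration excludes image-pull time, so for dnsmasq-dns-5f455d6d69-shfzc, where nothing was pulled (firstStartedPulling is the zero time), it equals the e2e duration. For redhat-marketplace-d982r further down the two diverge: 5.00s e2e minus the 2.43s spent pulling (14:52:49.921 to 14:52:52.355) gives its 2.57s SLO figure. A sketch to tabulate these records from a journal on stdin, with the field names copied from the tracker output:

    import re
    import sys

    # Sketch: summarize "Observed pod startup duration" records.
    PAT = re.compile(
        r'pod="([^"]+)" podStartSLOduration=([0-9.]+) '
        r'podStartE2EDuration="([^"]+)"')

    for line in sys.stdin:
        m = PAT.search(line)
        if m:
            print(f"{m.group(1)}: SLO={float(m.group(2)):.2f}s e2e={m.group(3)}")
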
pods=["openstack/openstack-galera-0"] Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.233942 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.235311 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.238434 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.238713 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-86z7s" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.238859 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.240377 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.246494 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.332849 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.332899 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3862e455-66ab-496f-92db-bf6cb6ef713e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3862e455-66ab-496f-92db-bf6cb6ef713e\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.332974 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65xhc\" (UniqueName: \"kubernetes.io/projected/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-kube-api-access-65xhc\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.333008 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.333036 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.333058 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.333080 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.333101 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.333119 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: W0930 14:52:50.394705 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66aa844f_2e00_40de_9084_1f68e6742ab1.slice/crio-9a27768f527459dbda2825ff44ff7bd523a74d5129ab0a871c75865af4d8a6be WatchSource:0}: Error finding container 9a27768f527459dbda2825ff44ff7bd523a74d5129ab0a871c75865af4d8a6be: Status 404 returned error can't find the container with id 9a27768f527459dbda2825ff44ff7bd523a74d5129ab0a871c75865af4d8a6be Sep 30 14:52:50 crc kubenswrapper[4763]: W0930 14:52:50.397776 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63a8ad25_874a_4688_a6cb_abfac16910a3.slice/crio-33768ca538ecc798a062a1b0139f2d89815dd2056f1240c1b9dc35d0e4df2c17 WatchSource:0}: Error finding container 33768ca538ecc798a062a1b0139f2d89815dd2056f1240c1b9dc35d0e4df2c17: Status 404 returned error can't find the container with id 33768ca538ecc798a062a1b0139f2d89815dd2056f1240c1b9dc35d0e4df2c17 Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.434114 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.434154 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3862e455-66ab-496f-92db-bf6cb6ef713e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3862e455-66ab-496f-92db-bf6cb6ef713e\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.434210 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65xhc\" (UniqueName: 
\"kubernetes.io/projected/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-kube-api-access-65xhc\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.434250 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.434271 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.434294 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.434316 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.434336 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.434353 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.434863 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.435366 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.436088 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-config-data-default\") pod \"openstack-cell1-galera-0\" 
(UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.437294 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.438013 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.441470 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.441567 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3862e455-66ab-496f-92db-bf6cb6ef713e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3862e455-66ab-496f-92db-bf6cb6ef713e\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/29d149739f481c60d340c916a5dcd2390b7ff6dc9a0d84167be7e8b2fdba7414/globalmount\"" pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.491187 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65xhc\" (UniqueName: \"kubernetes.io/projected/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-kube-api-access-65xhc\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.493225 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.493765 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.521529 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3862e455-66ab-496f-92db-bf6cb6ef713e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3862e455-66ab-496f-92db-bf6cb6ef713e\") pod \"openstack-cell1-galera-0\" (UID: \"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.793337 4763 util.go:30] "No sandbox for pod can be found. 
Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.944003 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"66aa844f-2e00-40de-9084-1f68e6742ab1","Type":"ContainerStarted","Data":"975ab4dbe81b511ce9c2b873b918ba9123b175b94d0da3374e933cf76317fb53"}
Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.944314 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"66aa844f-2e00-40de-9084-1f68e6742ab1","Type":"ContainerStarted","Data":"9a27768f527459dbda2825ff44ff7bd523a74d5129ab0a871c75865af4d8a6be"}
Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.944492 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.946324 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" event={"ID":"fba032d8-7e42-407e-91a5-edcbe0fb3491","Type":"ContainerStarted","Data":"93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59"}
Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.946510 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7"
Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.951715 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d982r" event={"ID":"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819","Type":"ContainerStarted","Data":"4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152"}
Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.956417 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"182ff55d-2f19-4e59-a425-583a949dad4c","Type":"ContainerStarted","Data":"ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b"}
Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.967014 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.966993827 podStartE2EDuration="2.966993827s" podCreationTimestamp="2025-09-30 14:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:52:50.96469346 +0000 UTC m=+4643.103253755" watchObservedRunningTime="2025-09-30 14:52:50.966993827 +0000 UTC m=+4643.105554112"
Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.967402 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"63a8ad25-874a-4688-a6cb-abfac16910a3","Type":"ContainerStarted","Data":"7c07fc90fae1c6f484ff3f028d441504598cea38cac8a0956c22c1d1af07a70f"}
Sep 30 14:52:50 crc kubenswrapper[4763]: I0930 14:52:50.967461 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"63a8ad25-874a-4688-a6cb-abfac16910a3","Type":"ContainerStarted","Data":"33768ca538ecc798a062a1b0139f2d89815dd2056f1240c1b9dc35d0e4df2c17"}
Sep 30 14:52:51 crc kubenswrapper[4763]: I0930 14:52:51.000437 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" podStartSLOduration=4.0004151 podStartE2EDuration="4.0004151s" podCreationTimestamp="2025-09-30 14:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:52:50.996097753 +0000 UTC m=+4643.134658038" watchObservedRunningTime="2025-09-30 14:52:51.0004151 +0000 UTC m=+4643.138975385"
Sep 30 14:52:51 crc kubenswrapper[4763]: I0930 14:52:51.250704 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Sep 30 14:52:51 crc kubenswrapper[4763]: W0930 14:52:51.259274 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dcd4ce_0d5a_4e4d_8ccd_dc604f22c561.slice/crio-c4b3797feb393e35f971227b9e177adda008621f5682a74ca93a10a376282e83 WatchSource:0}: Error finding container c4b3797feb393e35f971227b9e177adda008621f5682a74ca93a10a376282e83: Status 404 returned error can't find the container with id c4b3797feb393e35f971227b9e177adda008621f5682a74ca93a10a376282e83
Sep 30 14:52:51 crc kubenswrapper[4763]: I0930 14:52:51.973909 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561","Type":"ContainerStarted","Data":"e929f88adf54e78392167e63ef08cf47f73a23bc7675383f5d4ca1e0f3b635f1"}
Sep 30 14:52:51 crc kubenswrapper[4763]: I0930 14:52:51.974262 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561","Type":"ContainerStarted","Data":"c4b3797feb393e35f971227b9e177adda008621f5682a74ca93a10a376282e83"}
Sep 30 14:52:51 crc kubenswrapper[4763]: I0930 14:52:51.975839 4763 generic.go:334] "Generic (PLEG): container finished" podID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" containerID="4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152" exitCode=0
Sep 30 14:52:51 crc kubenswrapper[4763]: I0930 14:52:51.975960 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d982r" event={"ID":"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819","Type":"ContainerDied","Data":"4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152"}
Sep 30 14:52:52 crc kubenswrapper[4763]: I0930 14:52:52.984122 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d982r" event={"ID":"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819","Type":"ContainerStarted","Data":"f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080"}
Sep 30 14:52:53 crc kubenswrapper[4763]: I0930 14:52:53.002812 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d982r" podStartSLOduration=2.568770496 podStartE2EDuration="5.002793385s" podCreationTimestamp="2025-09-30 14:52:48 +0000 UTC" firstStartedPulling="2025-09-30 14:52:49.921164757 +0000 UTC m=+4642.059725042" lastFinishedPulling="2025-09-30 14:52:52.355187646 +0000 UTC m=+4644.493747931" observedRunningTime="2025-09-30 14:52:53.000624851 +0000 UTC m=+4645.139185146" watchObservedRunningTime="2025-09-30 14:52:53.002793385 +0000 UTC m=+4645.141353670"
Sep 30 14:52:55 crc kubenswrapper[4763]: I0930 14:52:55.009399 4763 generic.go:334] "Generic (PLEG): container finished" podID="46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561" containerID="e929f88adf54e78392167e63ef08cf47f73a23bc7675383f5d4ca1e0f3b635f1" exitCode=0
Sep 30 14:52:55 crc kubenswrapper[4763]: I0930 14:52:55.009502 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561","Type":"ContainerDied","Data":"e929f88adf54e78392167e63ef08cf47f73a23bc7675383f5d4ca1e0f3b635f1"}
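
The PLEG lines above show the usual init-container signature: a container reports ContainerDied with exitCode=0 and the pod then starts its next container, as both galera pods do here (containers e929f88a... and 7c07fc90... exit cleanly, and new containers come up at 14:52:56 below). The event={...} payload is plain JSON, so the stream is easy to reduce per pod:

    import json
    import re
    import sys

    # Sketch: reduce PLEG "event for pod" lines to (pod, type, container)
    # tuples; the event={...} payload is valid JSON as printed above.
    PAT = re.compile(r'pod="([^"]+)" event=(\{.*?\})')

    for line in sys.stdin:
        m = PAT.search(line)
        if m:
            ev = json.loads(m.group(2))
            print(m.group(1), ev["Type"], ev["Data"][:13])
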
kubenswrapper[4763]: I0930 14:52:55.012505 4763 generic.go:334] "Generic (PLEG): container finished" podID="63a8ad25-874a-4688-a6cb-abfac16910a3" containerID="7c07fc90fae1c6f484ff3f028d441504598cea38cac8a0956c22c1d1af07a70f" exitCode=0 Sep 30 14:52:55 crc kubenswrapper[4763]: I0930 14:52:55.012650 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"63a8ad25-874a-4688-a6cb-abfac16910a3","Type":"ContainerDied","Data":"7c07fc90fae1c6f484ff3f028d441504598cea38cac8a0956c22c1d1af07a70f"} Sep 30 14:52:56 crc kubenswrapper[4763]: I0930 14:52:56.023512 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561","Type":"ContainerStarted","Data":"f2a8558f36c85f73ff27d0d25ac6e6332fdb94c06472135d5e2ee0d91eecd978"} Sep 30 14:52:56 crc kubenswrapper[4763]: I0930 14:52:56.027067 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"63a8ad25-874a-4688-a6cb-abfac16910a3","Type":"ContainerStarted","Data":"88556ddc76b85480923c280ecc327e2dac891d0966b45cf009aef5fbe1567c07"} Sep 30 14:52:56 crc kubenswrapper[4763]: I0930 14:52:56.050734 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.050711845 podStartE2EDuration="7.050711845s" podCreationTimestamp="2025-09-30 14:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:52:56.046276234 +0000 UTC m=+4648.184836529" watchObservedRunningTime="2025-09-30 14:52:56.050711845 +0000 UTC m=+4648.189272130" Sep 30 14:52:56 crc kubenswrapper[4763]: I0930 14:52:56.085675 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.085658186 podStartE2EDuration="8.085658186s" podCreationTimestamp="2025-09-30 14:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:52:56.082794275 +0000 UTC m=+4648.221354560" watchObservedRunningTime="2025-09-30 14:52:56.085658186 +0000 UTC m=+4648.224218481" Sep 30 14:52:57 crc kubenswrapper[4763]: I0930 14:52:57.790554 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.082983 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.126206 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-dc7q7"] Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.126451 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" podUID="fba032d8-7e42-407e-91a5-edcbe0fb3491" containerName="dnsmasq-dns" containerID="cri-o://93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59" gracePeriod=10 Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.518807 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.664221 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-dns-svc\") pod \"fba032d8-7e42-407e-91a5-edcbe0fb3491\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.664326 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-config\") pod \"fba032d8-7e42-407e-91a5-edcbe0fb3491\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.664385 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krgpv\" (UniqueName: \"kubernetes.io/projected/fba032d8-7e42-407e-91a5-edcbe0fb3491-kube-api-access-krgpv\") pod \"fba032d8-7e42-407e-91a5-edcbe0fb3491\" (UID: \"fba032d8-7e42-407e-91a5-edcbe0fb3491\") " Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.669814 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fba032d8-7e42-407e-91a5-edcbe0fb3491-kube-api-access-krgpv" (OuterVolumeSpecName: "kube-api-access-krgpv") pod "fba032d8-7e42-407e-91a5-edcbe0fb3491" (UID: "fba032d8-7e42-407e-91a5-edcbe0fb3491"). InnerVolumeSpecName "kube-api-access-krgpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.708012 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fba032d8-7e42-407e-91a5-edcbe0fb3491" (UID: "fba032d8-7e42-407e-91a5-edcbe0fb3491"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.714643 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-config" (OuterVolumeSpecName: "config") pod "fba032d8-7e42-407e-91a5-edcbe0fb3491" (UID: "fba032d8-7e42-407e-91a5-edcbe0fb3491"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.766239 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.766276 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krgpv\" (UniqueName: \"kubernetes.io/projected/fba032d8-7e42-407e-91a5-edcbe0fb3491-kube-api-access-krgpv\") on node \"crc\" DevicePath \"\"" Sep 30 14:52:58 crc kubenswrapper[4763]: I0930 14:52:58.766286 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fba032d8-7e42-407e-91a5-edcbe0fb3491-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.050753 4763 generic.go:334] "Generic (PLEG): container finished" podID="fba032d8-7e42-407e-91a5-edcbe0fb3491" containerID="93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59" exitCode=0 Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.050804 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" event={"ID":"fba032d8-7e42-407e-91a5-edcbe0fb3491","Type":"ContainerDied","Data":"93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59"} Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.050834 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" event={"ID":"fba032d8-7e42-407e-91a5-edcbe0fb3491","Type":"ContainerDied","Data":"96eda918f2916c3d468248176660dcf03682d4eab9b1f50e997410518472fe89"} Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.050856 4763 scope.go:117] "RemoveContainer" containerID="93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.051011 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc7bd85-dc7q7" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.075487 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.075869 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.075965 4763 scope.go:117] "RemoveContainer" containerID="dd198006af934ddb40c8acc914254457d06cf6ae73b54de3bd6f946932a87aaf" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.092363 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-dc7q7"] Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.097875 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-dc7q7"] Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.109990 4763 scope.go:117] "RemoveContainer" containerID="93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59" Sep 30 14:52:59 crc kubenswrapper[4763]: E0930 14:52:59.110607 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59\": container with ID starting with 93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59 not found: ID does not exist" containerID="93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.110694 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59"} err="failed to get container status \"93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59\": rpc error: code = NotFound desc = could not find container \"93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59\": container with ID starting with 93cf8520745cc21522a412191a65e0a216fc52221d627067ee6f697c9b7caa59 not found: ID does not exist" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.110738 4763 scope.go:117] "RemoveContainer" containerID="dd198006af934ddb40c8acc914254457d06cf6ae73b54de3bd6f946932a87aaf" Sep 30 14:52:59 crc kubenswrapper[4763]: E0930 14:52:59.111472 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd198006af934ddb40c8acc914254457d06cf6ae73b54de3bd6f946932a87aaf\": container with ID starting with dd198006af934ddb40c8acc914254457d06cf6ae73b54de3bd6f946932a87aaf not found: ID does not exist" containerID="dd198006af934ddb40c8acc914254457d06cf6ae73b54de3bd6f946932a87aaf" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.111512 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd198006af934ddb40c8acc914254457d06cf6ae73b54de3bd6f946932a87aaf"} err="failed to get container status \"dd198006af934ddb40c8acc914254457d06cf6ae73b54de3bd6f946932a87aaf\": rpc error: code = NotFound desc = could not find container \"dd198006af934ddb40c8acc914254457d06cf6ae73b54de3bd6f946932a87aaf\": container with ID starting with dd198006af934ddb40c8acc914254457d06cf6ae73b54de3bd6f946932a87aaf not found: ID does not exist" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.122463 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.412287 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.541814 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.542074 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 14:52:59 crc kubenswrapper[4763]: I0930 14:52:59.591024 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 14:53:00 crc kubenswrapper[4763]: I0930 14:53:00.340690 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 14:53:00 crc kubenswrapper[4763]: I0930 14:53:00.498510 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fba032d8-7e42-407e-91a5-edcbe0fb3491" path="/var/lib/kubelet/pods/fba032d8-7e42-407e-91a5-edcbe0fb3491/volumes" Sep 30 14:53:00 crc kubenswrapper[4763]: I0930 14:53:00.524324 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:53:00 crc kubenswrapper[4763]: I0930 14:53:00.574403 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d982r"] Sep 30 14:53:00 crc kubenswrapper[4763]: I0930 14:53:00.794264 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 30 14:53:00 crc kubenswrapper[4763]: I0930 14:53:00.794321 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.073000 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d982r" podUID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" containerName="registry-server" containerID="cri-o://f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080" gracePeriod=2 Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.472843 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.625295 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7sts\" (UniqueName: \"kubernetes.io/projected/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-kube-api-access-h7sts\") pod \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.625699 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-utilities\") pod \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.625890 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-catalog-content\") pod \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\" (UID: \"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819\") " Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.626711 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-utilities" (OuterVolumeSpecName: "utilities") pod "fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" (UID: "fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.634070 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-kube-api-access-h7sts" (OuterVolumeSpecName: "kube-api-access-h7sts") pod "fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" (UID: "fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819"). InnerVolumeSpecName "kube-api-access-h7sts". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.639308 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" (UID: "fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.727472 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.727505 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7sts\" (UniqueName: \"kubernetes.io/projected/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-kube-api-access-h7sts\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.727516 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.854554 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 14:53:02 crc kubenswrapper[4763]: I0930 14:53:02.900937 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.080326 4763 generic.go:334] "Generic (PLEG): container finished" podID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" containerID="f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080" exitCode=0 Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.081773 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d982r" Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.085092 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d982r" event={"ID":"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819","Type":"ContainerDied","Data":"f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080"} Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.085137 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d982r" event={"ID":"fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819","Type":"ContainerDied","Data":"2518a9be06ba138923aa2698f67dba24cf8512f5e71d50f25a6b12fc09888b3d"} Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.085155 4763 scope.go:117] "RemoveContainer" containerID="f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080" Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.101687 4763 scope.go:117] "RemoveContainer" containerID="4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152" Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.120627 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d982r"] Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.125475 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d982r"] Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.128505 4763 scope.go:117] "RemoveContainer" containerID="daf7a8644ec47a3eadfa14cee06f9ed842f9cc8f02407fa4f49f5ae3463d1c35" Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.155259 4763 scope.go:117] "RemoveContainer" containerID="f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080" Sep 30 14:53:03 crc kubenswrapper[4763]: E0930 14:53:03.156147 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080\": container with ID starting with f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080 not found: ID does not exist" containerID="f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080" Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.156212 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080"} err="failed to get container status \"f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080\": rpc error: code = NotFound desc = could not find container \"f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080\": container with ID starting with f75e4a090426a06ab8a243bf5ec62e5a5d1b3f0bbc618f8c931b58193ec81080 not found: ID does not exist" Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.156239 4763 scope.go:117] "RemoveContainer" containerID="4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152" Sep 30 14:53:03 crc kubenswrapper[4763]: E0930 14:53:03.156657 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152\": container with ID starting with 4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152 not found: ID does not exist" containerID="4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152" Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.156763 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152"} err="failed to get container status \"4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152\": rpc error: code = NotFound desc = could not find container \"4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152\": container with ID starting with 4bf1a622a10b99665fe2004352c9370eec50ce479400b097bbabdfcb70775152 not found: ID does not exist" Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.156867 4763 scope.go:117] "RemoveContainer" containerID="daf7a8644ec47a3eadfa14cee06f9ed842f9cc8f02407fa4f49f5ae3463d1c35" Sep 30 14:53:03 crc kubenswrapper[4763]: E0930 14:53:03.157296 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daf7a8644ec47a3eadfa14cee06f9ed842f9cc8f02407fa4f49f5ae3463d1c35\": container with ID starting with daf7a8644ec47a3eadfa14cee06f9ed842f9cc8f02407fa4f49f5ae3463d1c35 not found: ID does not exist" containerID="daf7a8644ec47a3eadfa14cee06f9ed842f9cc8f02407fa4f49f5ae3463d1c35" Sep 30 14:53:03 crc kubenswrapper[4763]: I0930 14:53:03.157361 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daf7a8644ec47a3eadfa14cee06f9ed842f9cc8f02407fa4f49f5ae3463d1c35"} err="failed to get container status \"daf7a8644ec47a3eadfa14cee06f9ed842f9cc8f02407fa4f49f5ae3463d1c35\": rpc error: code = NotFound desc = could not find container \"daf7a8644ec47a3eadfa14cee06f9ed842f9cc8f02407fa4f49f5ae3463d1c35\": container with ID starting with daf7a8644ec47a3eadfa14cee06f9ed842f9cc8f02407fa4f49f5ae3463d1c35 not found: ID does not exist" Sep 30 14:53:04 crc kubenswrapper[4763]: I0930 14:53:04.501695 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" 
path="/var/lib/kubelet/pods/fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819/volumes" Sep 30 14:53:21 crc kubenswrapper[4763]: I0930 14:53:21.242235 4763 generic.go:334] "Generic (PLEG): container finished" podID="4972ddb3-00a4-458d-864f-bf101ef508c7" containerID="6bb3b35d0146b678be3d51ab5961ae06d78abeceaf6c7516fb30516d6b4ab030" exitCode=0 Sep 30 14:53:21 crc kubenswrapper[4763]: I0930 14:53:21.242296 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4972ddb3-00a4-458d-864f-bf101ef508c7","Type":"ContainerDied","Data":"6bb3b35d0146b678be3d51ab5961ae06d78abeceaf6c7516fb30516d6b4ab030"} Sep 30 14:53:21 crc kubenswrapper[4763]: I0930 14:53:21.245615 4763 generic.go:334] "Generic (PLEG): container finished" podID="182ff55d-2f19-4e59-a425-583a949dad4c" containerID="ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b" exitCode=0 Sep 30 14:53:21 crc kubenswrapper[4763]: I0930 14:53:21.245663 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"182ff55d-2f19-4e59-a425-583a949dad4c","Type":"ContainerDied","Data":"ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b"} Sep 30 14:53:22 crc kubenswrapper[4763]: I0930 14:53:22.255548 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"182ff55d-2f19-4e59-a425-583a949dad4c","Type":"ContainerStarted","Data":"e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9"} Sep 30 14:53:22 crc kubenswrapper[4763]: I0930 14:53:22.258163 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:22 crc kubenswrapper[4763]: I0930 14:53:22.260452 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4972ddb3-00a4-458d-864f-bf101ef508c7","Type":"ContainerStarted","Data":"6b71ef21e730f12e278ba3a5faf9bb1e0144cbda5049b74d39b4237c30bd9885"} Sep 30 14:53:22 crc kubenswrapper[4763]: I0930 14:53:22.260771 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 14:53:22 crc kubenswrapper[4763]: I0930 14:53:22.280340 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.280317163 podStartE2EDuration="35.280317163s" podCreationTimestamp="2025-09-30 14:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:53:22.279673116 +0000 UTC m=+4674.418233401" watchObservedRunningTime="2025-09-30 14:53:22.280317163 +0000 UTC m=+4674.418877448" Sep 30 14:53:22 crc kubenswrapper[4763]: I0930 14:53:22.313309 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.313284804 podStartE2EDuration="35.313284804s" podCreationTimestamp="2025-09-30 14:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:53:22.312296251 +0000 UTC m=+4674.450856536" watchObservedRunningTime="2025-09-30 14:53:22.313284804 +0000 UTC m=+4674.451845099" Sep 30 14:53:36 crc kubenswrapper[4763]: I0930 14:53:36.059996 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:53:36 crc kubenswrapper[4763]: I0930 14:53:36.060565 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:53:38 crc kubenswrapper[4763]: I0930 14:53:38.968836 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 14:53:39 crc kubenswrapper[4763]: I0930 14:53:39.388819 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.456057 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-xfwmm"] Sep 30 14:53:41 crc kubenswrapper[4763]: E0930 14:53:41.456687 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba032d8-7e42-407e-91a5-edcbe0fb3491" containerName="dnsmasq-dns" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.456702 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba032d8-7e42-407e-91a5-edcbe0fb3491" containerName="dnsmasq-dns" Sep 30 14:53:41 crc kubenswrapper[4763]: E0930 14:53:41.456722 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" containerName="registry-server" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.456729 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" containerName="registry-server" Sep 30 14:53:41 crc kubenswrapper[4763]: E0930 14:53:41.456757 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" containerName="extract-content" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.456763 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" containerName="extract-content" Sep 30 14:53:41 crc kubenswrapper[4763]: E0930 14:53:41.456775 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba032d8-7e42-407e-91a5-edcbe0fb3491" containerName="init" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.456781 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba032d8-7e42-407e-91a5-edcbe0fb3491" containerName="init" Sep 30 14:53:41 crc kubenswrapper[4763]: E0930 14:53:41.456794 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" containerName="extract-utilities" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.456803 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" containerName="extract-utilities" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.456958 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1dac0e-c2b8-4ab0-a28d-7a8bbb5f6819" containerName="registry-server" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.456974 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fba032d8-7e42-407e-91a5-edcbe0fb3491" containerName="dnsmasq-dns" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.458053 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.464032 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-xfwmm"] Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.558992 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-dns-svc\") pod \"dnsmasq-dns-6885566dd9-xfwmm\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.559109 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qv6z\" (UniqueName: \"kubernetes.io/projected/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-kube-api-access-9qv6z\") pod \"dnsmasq-dns-6885566dd9-xfwmm\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.559143 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-config\") pod \"dnsmasq-dns-6885566dd9-xfwmm\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.661045 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-dns-svc\") pod \"dnsmasq-dns-6885566dd9-xfwmm\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.661198 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qv6z\" (UniqueName: \"kubernetes.io/projected/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-kube-api-access-9qv6z\") pod \"dnsmasq-dns-6885566dd9-xfwmm\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.661233 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-config\") pod \"dnsmasq-dns-6885566dd9-xfwmm\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.661956 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-dns-svc\") pod \"dnsmasq-dns-6885566dd9-xfwmm\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.662387 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-config\") pod \"dnsmasq-dns-6885566dd9-xfwmm\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:41 crc kubenswrapper[4763]: I0930 14:53:41.897467 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qv6z\" (UniqueName: 
\"kubernetes.io/projected/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-kube-api-access-9qv6z\") pod \"dnsmasq-dns-6885566dd9-xfwmm\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:42 crc kubenswrapper[4763]: I0930 14:53:42.084746 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:42 crc kubenswrapper[4763]: I0930 14:53:42.102289 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:53:42 crc kubenswrapper[4763]: I0930 14:53:42.554030 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-xfwmm"] Sep 30 14:53:42 crc kubenswrapper[4763]: W0930 14:53:42.566122 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod490fc3cc_71b7_4a1b_9cc0_a98cbfb2e420.slice/crio-f069333388a71e84f21d931a2321e40e13b61f97b6ba0ad44f465a55506708af WatchSource:0}: Error finding container f069333388a71e84f21d931a2321e40e13b61f97b6ba0ad44f465a55506708af: Status 404 returned error can't find the container with id f069333388a71e84f21d931a2321e40e13b61f97b6ba0ad44f465a55506708af Sep 30 14:53:42 crc kubenswrapper[4763]: I0930 14:53:42.718015 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:53:43 crc kubenswrapper[4763]: I0930 14:53:43.434152 4763 generic.go:334] "Generic (PLEG): container finished" podID="490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" containerID="82949582a99bf7415dc35246acd3ddcde8be83f79dd91801eb7deb008a73d429" exitCode=0 Sep 30 14:53:43 crc kubenswrapper[4763]: I0930 14:53:43.434213 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" event={"ID":"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420","Type":"ContainerDied","Data":"82949582a99bf7415dc35246acd3ddcde8be83f79dd91801eb7deb008a73d429"} Sep 30 14:53:43 crc kubenswrapper[4763]: I0930 14:53:43.434242 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" event={"ID":"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420","Type":"ContainerStarted","Data":"f069333388a71e84f21d931a2321e40e13b61f97b6ba0ad44f465a55506708af"} Sep 30 14:53:43 crc kubenswrapper[4763]: I0930 14:53:43.995086 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="4972ddb3-00a4-458d-864f-bf101ef508c7" containerName="rabbitmq" containerID="cri-o://6b71ef21e730f12e278ba3a5faf9bb1e0144cbda5049b74d39b4237c30bd9885" gracePeriod=604799 Sep 30 14:53:44 crc kubenswrapper[4763]: I0930 14:53:44.442879 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" event={"ID":"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420","Type":"ContainerStarted","Data":"5f4145f40a965940038635da21258c9dc3fd4a1feb1bbc2d51d606ee776c5df4"} Sep 30 14:53:44 crc kubenswrapper[4763]: I0930 14:53:44.443136 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:44 crc kubenswrapper[4763]: I0930 14:53:44.458454 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" podStartSLOduration=3.458433279 podStartE2EDuration="3.458433279s" podCreationTimestamp="2025-09-30 14:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 14:53:44.4573144 +0000 UTC m=+4696.595874735" watchObservedRunningTime="2025-09-30 14:53:44.458433279 +0000 UTC m=+4696.596993584" Sep 30 14:53:44 crc kubenswrapper[4763]: I0930 14:53:44.608740 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="182ff55d-2f19-4e59-a425-583a949dad4c" containerName="rabbitmq" containerID="cri-o://e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9" gracePeriod=604799 Sep 30 14:53:48 crc kubenswrapper[4763]: I0930 14:53:48.966500 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="4972ddb3-00a4-458d-864f-bf101ef508c7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.240:5672: connect: connection refused" Sep 30 14:53:49 crc kubenswrapper[4763]: I0930 14:53:49.386151 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="182ff55d-2f19-4e59-a425-583a949dad4c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.242:5672: connect: connection refused" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.487738 4763 generic.go:334] "Generic (PLEG): container finished" podID="4972ddb3-00a4-458d-864f-bf101ef508c7" containerID="6b71ef21e730f12e278ba3a5faf9bb1e0144cbda5049b74d39b4237c30bd9885" exitCode=0 Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.487819 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4972ddb3-00a4-458d-864f-bf101ef508c7","Type":"ContainerDied","Data":"6b71ef21e730f12e278ba3a5faf9bb1e0144cbda5049b74d39b4237c30bd9885"} Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.551046 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.632002 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a\") pod \"4972ddb3-00a4-458d-864f-bf101ef508c7\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.632086 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-plugins-conf\") pod \"4972ddb3-00a4-458d-864f-bf101ef508c7\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.632169 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-confd\") pod \"4972ddb3-00a4-458d-864f-bf101ef508c7\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.632225 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4972ddb3-00a4-458d-864f-bf101ef508c7-pod-info\") pod \"4972ddb3-00a4-458d-864f-bf101ef508c7\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.632250 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-erlang-cookie\") pod \"4972ddb3-00a4-458d-864f-bf101ef508c7\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.632273 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-server-conf\") pod \"4972ddb3-00a4-458d-864f-bf101ef508c7\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.632308 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-plugins\") pod \"4972ddb3-00a4-458d-864f-bf101ef508c7\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.632325 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4972ddb3-00a4-458d-864f-bf101ef508c7-erlang-cookie-secret\") pod \"4972ddb3-00a4-458d-864f-bf101ef508c7\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.632349 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5phw9\" (UniqueName: \"kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-kube-api-access-5phw9\") pod \"4972ddb3-00a4-458d-864f-bf101ef508c7\" (UID: \"4972ddb3-00a4-458d-864f-bf101ef508c7\") " Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.633199 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: 
"rabbitmq-erlang-cookie") pod "4972ddb3-00a4-458d-864f-bf101ef508c7" (UID: "4972ddb3-00a4-458d-864f-bf101ef508c7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.633328 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4972ddb3-00a4-458d-864f-bf101ef508c7" (UID: "4972ddb3-00a4-458d-864f-bf101ef508c7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.633723 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4972ddb3-00a4-458d-864f-bf101ef508c7" (UID: "4972ddb3-00a4-458d-864f-bf101ef508c7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.639166 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4972ddb3-00a4-458d-864f-bf101ef508c7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4972ddb3-00a4-458d-864f-bf101ef508c7" (UID: "4972ddb3-00a4-458d-864f-bf101ef508c7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.639733 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-kube-api-access-5phw9" (OuterVolumeSpecName: "kube-api-access-5phw9") pod "4972ddb3-00a4-458d-864f-bf101ef508c7" (UID: "4972ddb3-00a4-458d-864f-bf101ef508c7"). InnerVolumeSpecName "kube-api-access-5phw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.645889 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4972ddb3-00a4-458d-864f-bf101ef508c7-pod-info" (OuterVolumeSpecName: "pod-info") pod "4972ddb3-00a4-458d-864f-bf101ef508c7" (UID: "4972ddb3-00a4-458d-864f-bf101ef508c7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.658494 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-server-conf" (OuterVolumeSpecName: "server-conf") pod "4972ddb3-00a4-458d-864f-bf101ef508c7" (UID: "4972ddb3-00a4-458d-864f-bf101ef508c7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.663763 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a" (OuterVolumeSpecName: "persistence") pod "4972ddb3-00a4-458d-864f-bf101ef508c7" (UID: "4972ddb3-00a4-458d-864f-bf101ef508c7"). InnerVolumeSpecName "pvc-09e64457-1467-4834-b64c-01f23c1cc33a". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.719397 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4972ddb3-00a4-458d-864f-bf101ef508c7" (UID: "4972ddb3-00a4-458d-864f-bf101ef508c7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.734191 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4972ddb3-00a4-458d-864f-bf101ef508c7-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.734221 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.734231 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.734239 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.734247 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4972ddb3-00a4-458d-864f-bf101ef508c7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.734255 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5phw9\" (UniqueName: \"kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-kube-api-access-5phw9\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.740873 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-09e64457-1467-4834-b64c-01f23c1cc33a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a\") on node \"crc\" " Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.740903 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4972ddb3-00a4-458d-864f-bf101ef508c7-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.740919 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4972ddb3-00a4-458d-864f-bf101ef508c7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.761159 4763 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.761313 4763 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-09e64457-1467-4834-b64c-01f23c1cc33a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a") on node "crc" Sep 30 14:53:50 crc kubenswrapper[4763]: I0930 14:53:50.842386 4763 reconciler_common.go:293] "Volume detached for volume \"pvc-09e64457-1467-4834-b64c-01f23c1cc33a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.195658 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.350995 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzrrn\" (UniqueName: \"kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-kube-api-access-hzrrn\") pod \"182ff55d-2f19-4e59-a425-583a949dad4c\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.351137 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\") pod \"182ff55d-2f19-4e59-a425-583a949dad4c\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.351187 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-plugins-conf\") pod \"182ff55d-2f19-4e59-a425-583a949dad4c\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.351206 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-confd\") pod \"182ff55d-2f19-4e59-a425-583a949dad4c\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.351228 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-server-conf\") pod \"182ff55d-2f19-4e59-a425-583a949dad4c\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.351756 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-erlang-cookie\") pod \"182ff55d-2f19-4e59-a425-583a949dad4c\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.351802 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/182ff55d-2f19-4e59-a425-583a949dad4c-pod-info\") pod \"182ff55d-2f19-4e59-a425-583a949dad4c\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.351834 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-plugins\") pod \"182ff55d-2f19-4e59-a425-583a949dad4c\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.351854 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "182ff55d-2f19-4e59-a425-583a949dad4c" (UID: "182ff55d-2f19-4e59-a425-583a949dad4c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.351928 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/182ff55d-2f19-4e59-a425-583a949dad4c-erlang-cookie-secret\") pod \"182ff55d-2f19-4e59-a425-583a949dad4c\" (UID: \"182ff55d-2f19-4e59-a425-583a949dad4c\") " Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.352270 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "182ff55d-2f19-4e59-a425-583a949dad4c" (UID: "182ff55d-2f19-4e59-a425-583a949dad4c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.352296 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.352498 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "182ff55d-2f19-4e59-a425-583a949dad4c" (UID: "182ff55d-2f19-4e59-a425-583a949dad4c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.356822 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/182ff55d-2f19-4e59-a425-583a949dad4c-pod-info" (OuterVolumeSpecName: "pod-info") pod "182ff55d-2f19-4e59-a425-583a949dad4c" (UID: "182ff55d-2f19-4e59-a425-583a949dad4c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.356847 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-kube-api-access-hzrrn" (OuterVolumeSpecName: "kube-api-access-hzrrn") pod "182ff55d-2f19-4e59-a425-583a949dad4c" (UID: "182ff55d-2f19-4e59-a425-583a949dad4c"). InnerVolumeSpecName "kube-api-access-hzrrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.358920 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/182ff55d-2f19-4e59-a425-583a949dad4c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "182ff55d-2f19-4e59-a425-583a949dad4c" (UID: "182ff55d-2f19-4e59-a425-583a949dad4c"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.367888 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671" (OuterVolumeSpecName: "persistence") pod "182ff55d-2f19-4e59-a425-583a949dad4c" (UID: "182ff55d-2f19-4e59-a425-583a949dad4c"). InnerVolumeSpecName "pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.370022 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-server-conf" (OuterVolumeSpecName: "server-conf") pod "182ff55d-2f19-4e59-a425-583a949dad4c" (UID: "182ff55d-2f19-4e59-a425-583a949dad4c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.428753 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "182ff55d-2f19-4e59-a425-583a949dad4c" (UID: "182ff55d-2f19-4e59-a425-583a949dad4c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.453204 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/182ff55d-2f19-4e59-a425-583a949dad4c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.453255 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzrrn\" (UniqueName: \"kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-kube-api-access-hzrrn\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.453305 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\") on node \"crc\" " Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.453326 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.453340 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/182ff55d-2f19-4e59-a425-583a949dad4c-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.453352 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.453360 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/182ff55d-2f19-4e59-a425-583a949dad4c-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.453368 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/182ff55d-2f19-4e59-a425-583a949dad4c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.469147 4763 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.469332 4763 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671") on node "crc" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.496449 4763 generic.go:334] "Generic (PLEG): container finished" podID="182ff55d-2f19-4e59-a425-583a949dad4c" containerID="e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9" exitCode=0 Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.496504 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.496533 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"182ff55d-2f19-4e59-a425-583a949dad4c","Type":"ContainerDied","Data":"e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9"} Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.496885 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"182ff55d-2f19-4e59-a425-583a949dad4c","Type":"ContainerDied","Data":"684186d1d67cdc1fae90f891505bc525bddbe08704a278d92f5ab3667b26ee04"} Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.496913 4763 scope.go:117] "RemoveContainer" containerID="e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.498383 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4972ddb3-00a4-458d-864f-bf101ef508c7","Type":"ContainerDied","Data":"f3fd30e7384a5f9b6992b3cf566b6cc686640a65f727da3bb1fcb3d765672dfc"} Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.498472 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.515884 4763 scope.go:117] "RemoveContainer" containerID="ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.535168 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.535922 4763 scope.go:117] "RemoveContainer" containerID="e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9" Sep 30 14:53:51 crc kubenswrapper[4763]: E0930 14:53:51.536424 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9\": container with ID starting with e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9 not found: ID does not exist" containerID="e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.536473 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9"} err="failed to get container status \"e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9\": rpc error: code = NotFound desc = could not find container \"e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9\": container with ID starting with e80d997488195bffe30a5a819faeb17cc29ccb27dbd5fba2bc15c61b1db925b9 not found: ID does not exist" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.536498 4763 scope.go:117] "RemoveContainer" containerID="ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b" Sep 30 14:53:51 crc kubenswrapper[4763]: E0930 14:53:51.537928 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b\": container with ID starting with ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b not found: ID does not exist" containerID="ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.537954 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b"} err="failed to get container status \"ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b\": rpc error: code = NotFound desc = could not find container \"ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b\": container with ID starting with ca397b1a627e22a06a4f9ad86cb36ee4a975499aec31f4f51b0754e534f8ed8b not found: ID does not exist" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.537968 4763 scope.go:117] "RemoveContainer" containerID="6b71ef21e730f12e278ba3a5faf9bb1e0144cbda5049b74d39b4237c30bd9885" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.540196 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.545814 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.554267 4763 reconciler_common.go:293] "Volume detached for volume \"pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.556575 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.556741 4763 scope.go:117] "RemoveContainer" containerID="6bb3b35d0146b678be3d51ab5961ae06d78abeceaf6c7516fb30516d6b4ab030" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.571379 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:53:51 crc kubenswrapper[4763]: E0930 14:53:51.571823 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4972ddb3-00a4-458d-864f-bf101ef508c7" containerName="setup-container" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.571845 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4972ddb3-00a4-458d-864f-bf101ef508c7" containerName="setup-container" Sep 30 14:53:51 crc kubenswrapper[4763]: E0930 14:53:51.571862 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="182ff55d-2f19-4e59-a425-583a949dad4c" containerName="setup-container" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.571871 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="182ff55d-2f19-4e59-a425-583a949dad4c" containerName="setup-container" Sep 30 14:53:51 crc kubenswrapper[4763]: E0930 14:53:51.571893 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4972ddb3-00a4-458d-864f-bf101ef508c7" containerName="rabbitmq" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.571902 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4972ddb3-00a4-458d-864f-bf101ef508c7" containerName="rabbitmq" Sep 30 14:53:51 crc kubenswrapper[4763]: E0930 14:53:51.571921 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="182ff55d-2f19-4e59-a425-583a949dad4c" containerName="rabbitmq" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.571929 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="182ff55d-2f19-4e59-a425-583a949dad4c" containerName="rabbitmq" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.572118 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4972ddb3-00a4-458d-864f-bf101ef508c7" containerName="rabbitmq" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.572147 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="182ff55d-2f19-4e59-a425-583a949dad4c" containerName="rabbitmq" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.573153 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.574949 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.575225 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.575348 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.575458 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.575579 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-bfg25" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.585153 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.595775 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.597223 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.602967 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.602972 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.603110 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.603189 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mgrb6" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.603235 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.612734 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.655073 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.655140 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.655170 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.655317 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.655436 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.655502 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.655543 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.655570 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.655653 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r42jm\" (UniqueName: \"kubernetes.io/projected/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-kube-api-access-r42jm\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.756837 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6a0ad5b-1256-433d-b04a-aa120b250440-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.756884 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6a0ad5b-1256-433d-b04a-aa120b250440-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.756926 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757064 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6a0ad5b-1256-433d-b04a-aa120b250440-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757182 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757206 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757271 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2ggq\" (UniqueName: \"kubernetes.io/projected/f6a0ad5b-1256-433d-b04a-aa120b250440-kube-api-access-l2ggq\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757293 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r42jm\" (UniqueName: \"kubernetes.io/projected/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-kube-api-access-r42jm\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757330 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757376 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-09e64457-1467-4834-b64c-01f23c1cc33a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757412 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/f6a0ad5b-1256-433d-b04a-aa120b250440-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757430 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6a0ad5b-1256-433d-b04a-aa120b250440-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757659 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757725 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757763 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6a0ad5b-1256-433d-b04a-aa120b250440-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757792 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757812 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6a0ad5b-1256-433d-b04a-aa120b250440-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.757937 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.758343 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.758668 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.758842 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.760793 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.760857 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/adddfef9fb30291f48b6bccb14fbabcdc963ca594408526e8a49d5a40750beac/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.761497 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.761516 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.763245 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.772698 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r42jm\" (UniqueName: \"kubernetes.io/projected/d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207-kube-api-access-r42jm\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.785969 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d393d84d-60a5-4848-86c2-0b7a1cd60671\") pod \"rabbitmq-cell1-server-0\" (UID: \"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.859391 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6a0ad5b-1256-433d-b04a-aa120b250440-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: 
I0930 14:53:51.859440 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6a0ad5b-1256-433d-b04a-aa120b250440-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.859465 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6a0ad5b-1256-433d-b04a-aa120b250440-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.859483 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6a0ad5b-1256-433d-b04a-aa120b250440-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.859524 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6a0ad5b-1256-433d-b04a-aa120b250440-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.859564 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2ggq\" (UniqueName: \"kubernetes.io/projected/f6a0ad5b-1256-433d-b04a-aa120b250440-kube-api-access-l2ggq\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.859620 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-09e64457-1467-4834-b64c-01f23c1cc33a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.859646 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6a0ad5b-1256-433d-b04a-aa120b250440-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.859663 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6a0ad5b-1256-433d-b04a-aa120b250440-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.860169 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6a0ad5b-1256-433d-b04a-aa120b250440-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.860314 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f6a0ad5b-1256-433d-b04a-aa120b250440-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.860970 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6a0ad5b-1256-433d-b04a-aa120b250440-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.861260 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6a0ad5b-1256-433d-b04a-aa120b250440-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.863016 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6a0ad5b-1256-433d-b04a-aa120b250440-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.863406 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6a0ad5b-1256-433d-b04a-aa120b250440-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.864416 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6a0ad5b-1256-433d-b04a-aa120b250440-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.864433 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.864470 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-09e64457-1467-4834-b64c-01f23c1cc33a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9cce1aac6127f0ea8a55e8cb08c10c0074ed4c07f0c00b05440c84be8d071db5/globalmount\"" pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.878154 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2ggq\" (UniqueName: \"kubernetes.io/projected/f6a0ad5b-1256-433d-b04a-aa120b250440-kube-api-access-l2ggq\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.892027 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-09e64457-1467-4834-b64c-01f23c1cc33a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09e64457-1467-4834-b64c-01f23c1cc33a\") pod \"rabbitmq-server-0\" (UID: \"f6a0ad5b-1256-433d-b04a-aa120b250440\") " pod="openstack/rabbitmq-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.899982 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:53:51 crc kubenswrapper[4763]: I0930 14:53:51.925119 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.086542 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.140361 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-shfzc"] Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.140573 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" podUID="5427eafc-b753-4e2f-92bc-b0f4ccbd357c" containerName="dnsmasq-dns" containerID="cri-o://9179905f9defb87b3a1a00342be6532d21f5372388d383b53351c94998a1ab0f" gracePeriod=10 Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.379532 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.492034 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.515349 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="182ff55d-2f19-4e59-a425-583a949dad4c" path="/var/lib/kubelet/pods/182ff55d-2f19-4e59-a425-583a949dad4c/volumes" Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.516454 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4972ddb3-00a4-458d-864f-bf101ef508c7" path="/var/lib/kubelet/pods/4972ddb3-00a4-458d-864f-bf101ef508c7/volumes" Sep 30 14:53:52 crc kubenswrapper[4763]: W0930 14:53:52.527903 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6a0ad5b_1256_433d_b04a_aa120b250440.slice/crio-21f8206ceb84d67fd5304888dcdb146e0c1380cacaaea14bf0f6abd0ed1611f1 WatchSource:0}: Error 
finding container 21f8206ceb84d67fd5304888dcdb146e0c1380cacaaea14bf0f6abd0ed1611f1: Status 404 returned error can't find the container with id 21f8206ceb84d67fd5304888dcdb146e0c1380cacaaea14bf0f6abd0ed1611f1 Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.531518 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207","Type":"ContainerStarted","Data":"886fcd19be640add926a336730d0e491bf9cdcf35057611ac2aabd066a74dfdf"} Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.534468 4763 generic.go:334] "Generic (PLEG): container finished" podID="5427eafc-b753-4e2f-92bc-b0f4ccbd357c" containerID="9179905f9defb87b3a1a00342be6532d21f5372388d383b53351c94998a1ab0f" exitCode=0 Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.534769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" event={"ID":"5427eafc-b753-4e2f-92bc-b0f4ccbd357c","Type":"ContainerDied","Data":"9179905f9defb87b3a1a00342be6532d21f5372388d383b53351c94998a1ab0f"} Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.601130 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.674522 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-dns-svc\") pod \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.674594 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jjfz\" (UniqueName: \"kubernetes.io/projected/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-kube-api-access-2jjfz\") pod \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.674705 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-config\") pod \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\" (UID: \"5427eafc-b753-4e2f-92bc-b0f4ccbd357c\") " Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.679709 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-kube-api-access-2jjfz" (OuterVolumeSpecName: "kube-api-access-2jjfz") pod "5427eafc-b753-4e2f-92bc-b0f4ccbd357c" (UID: "5427eafc-b753-4e2f-92bc-b0f4ccbd357c"). InnerVolumeSpecName "kube-api-access-2jjfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.726512 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5427eafc-b753-4e2f-92bc-b0f4ccbd357c" (UID: "5427eafc-b753-4e2f-92bc-b0f4ccbd357c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.776898 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.776950 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jjfz\" (UniqueName: \"kubernetes.io/projected/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-kube-api-access-2jjfz\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.801716 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-config" (OuterVolumeSpecName: "config") pod "5427eafc-b753-4e2f-92bc-b0f4ccbd357c" (UID: "5427eafc-b753-4e2f-92bc-b0f4ccbd357c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:53:52 crc kubenswrapper[4763]: I0930 14:53:52.877930 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5427eafc-b753-4e2f-92bc-b0f4ccbd357c-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:53 crc kubenswrapper[4763]: I0930 14:53:53.548999 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" Sep 30 14:53:53 crc kubenswrapper[4763]: I0930 14:53:53.549000 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f455d6d69-shfzc" event={"ID":"5427eafc-b753-4e2f-92bc-b0f4ccbd357c","Type":"ContainerDied","Data":"8d966481e65e92d8e4223b20d842cbdc3dd00f46d2a5db4ad8d52a650d2e6540"} Sep 30 14:53:53 crc kubenswrapper[4763]: I0930 14:53:53.549715 4763 scope.go:117] "RemoveContainer" containerID="9179905f9defb87b3a1a00342be6532d21f5372388d383b53351c94998a1ab0f" Sep 30 14:53:53 crc kubenswrapper[4763]: I0930 14:53:53.551068 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f6a0ad5b-1256-433d-b04a-aa120b250440","Type":"ContainerStarted","Data":"2f01808cfe53fa25ad98e0fabad725d9d1cfffb21c346c715b883b32436c8638"} Sep 30 14:53:53 crc kubenswrapper[4763]: I0930 14:53:53.551103 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f6a0ad5b-1256-433d-b04a-aa120b250440","Type":"ContainerStarted","Data":"21f8206ceb84d67fd5304888dcdb146e0c1380cacaaea14bf0f6abd0ed1611f1"} Sep 30 14:53:53 crc kubenswrapper[4763]: I0930 14:53:53.553458 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207","Type":"ContainerStarted","Data":"c261b153f5d3d69c20c64a6e66af45e8b6c1d72010be82ebf3ca7951f86a0b68"} Sep 30 14:53:53 crc kubenswrapper[4763]: I0930 14:53:53.578166 4763 scope.go:117] "RemoveContainer" containerID="bc464c583bead03d4da0cbd5c433c1345a7342a20924e79d526fb34e54dd5ee8" Sep 30 14:53:53 crc kubenswrapper[4763]: I0930 14:53:53.593344 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-shfzc"] Sep 30 14:53:53 crc kubenswrapper[4763]: I0930 14:53:53.598549 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-shfzc"] Sep 30 14:53:54 crc kubenswrapper[4763]: I0930 14:53:54.499995 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5427eafc-b753-4e2f-92bc-b0f4ccbd357c" 
path="/var/lib/kubelet/pods/5427eafc-b753-4e2f-92bc-b0f4ccbd357c/volumes" Sep 30 14:54:06 crc kubenswrapper[4763]: I0930 14:54:06.060292 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:54:06 crc kubenswrapper[4763]: I0930 14:54:06.061028 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:54:23 crc kubenswrapper[4763]: I0930 14:54:23.774969 4763 generic.go:334] "Generic (PLEG): container finished" podID="d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207" containerID="c261b153f5d3d69c20c64a6e66af45e8b6c1d72010be82ebf3ca7951f86a0b68" exitCode=0 Sep 30 14:54:23 crc kubenswrapper[4763]: I0930 14:54:23.775229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207","Type":"ContainerDied","Data":"c261b153f5d3d69c20c64a6e66af45e8b6c1d72010be82ebf3ca7951f86a0b68"} Sep 30 14:54:23 crc kubenswrapper[4763]: I0930 14:54:23.779549 4763 generic.go:334] "Generic (PLEG): container finished" podID="f6a0ad5b-1256-433d-b04a-aa120b250440" containerID="2f01808cfe53fa25ad98e0fabad725d9d1cfffb21c346c715b883b32436c8638" exitCode=0 Sep 30 14:54:23 crc kubenswrapper[4763]: I0930 14:54:23.779582 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f6a0ad5b-1256-433d-b04a-aa120b250440","Type":"ContainerDied","Data":"2f01808cfe53fa25ad98e0fabad725d9d1cfffb21c346c715b883b32436c8638"} Sep 30 14:54:24 crc kubenswrapper[4763]: I0930 14:54:24.790376 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f6a0ad5b-1256-433d-b04a-aa120b250440","Type":"ContainerStarted","Data":"09ab8e581016156c12f14d59bca0b4b18bc52cbe430b4271da8a55db57db1a23"} Sep 30 14:54:24 crc kubenswrapper[4763]: I0930 14:54:24.791275 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 14:54:24 crc kubenswrapper[4763]: I0930 14:54:24.793008 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207","Type":"ContainerStarted","Data":"dd50b6896619f0bfdb2b6634a25433705a18ec2a70528910e4f9f337cc2a0b21"} Sep 30 14:54:24 crc kubenswrapper[4763]: I0930 14:54:24.793384 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:54:24 crc kubenswrapper[4763]: I0930 14:54:24.813333 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=33.813311723 podStartE2EDuration="33.813311723s" podCreationTimestamp="2025-09-30 14:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:54:24.811329053 +0000 UTC m=+4736.949889358" watchObservedRunningTime="2025-09-30 14:54:24.813311723 +0000 UTC m=+4736.951872018" Sep 30 14:54:24 crc kubenswrapper[4763]: I0930 14:54:24.837558 4763 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=33.837541527 podStartE2EDuration="33.837541527s" podCreationTimestamp="2025-09-30 14:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:54:24.836154482 +0000 UTC m=+4736.974714837" watchObservedRunningTime="2025-09-30 14:54:24.837541527 +0000 UTC m=+4736.976101812" Sep 30 14:54:36 crc kubenswrapper[4763]: I0930 14:54:36.059972 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:54:36 crc kubenswrapper[4763]: I0930 14:54:36.060671 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:54:36 crc kubenswrapper[4763]: I0930 14:54:36.060728 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 14:54:36 crc kubenswrapper[4763]: I0930 14:54:36.061180 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4da597f3a8b25df538033d6e0e1f219426e990b064711c97664abf3092e45110"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:54:36 crc kubenswrapper[4763]: I0930 14:54:36.061231 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://4da597f3a8b25df538033d6e0e1f219426e990b064711c97664abf3092e45110" gracePeriod=600 Sep 30 14:54:36 crc kubenswrapper[4763]: I0930 14:54:36.909982 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="4da597f3a8b25df538033d6e0e1f219426e990b064711c97664abf3092e45110" exitCode=0 Sep 30 14:54:36 crc kubenswrapper[4763]: I0930 14:54:36.910044 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"4da597f3a8b25df538033d6e0e1f219426e990b064711c97664abf3092e45110"} Sep 30 14:54:36 crc kubenswrapper[4763]: I0930 14:54:36.911163 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3"} Sep 30 14:54:36 crc kubenswrapper[4763]: I0930 14:54:36.911236 4763 scope.go:117] "RemoveContainer" containerID="baf7eab0550424b40be822bfe126b887bfcd6a1ac094a673fd4b636e46c0e51e" Sep 30 14:54:41 crc kubenswrapper[4763]: I0930 14:54:41.902984 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:54:41 crc kubenswrapper[4763]: I0930 14:54:41.934852 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 14:54:48 crc kubenswrapper[4763]: I0930 14:54:48.735989 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Sep 30 14:54:48 crc kubenswrapper[4763]: E0930 14:54:48.736886 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5427eafc-b753-4e2f-92bc-b0f4ccbd357c" containerName="init" Sep 30 14:54:48 crc kubenswrapper[4763]: I0930 14:54:48.736905 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5427eafc-b753-4e2f-92bc-b0f4ccbd357c" containerName="init" Sep 30 14:54:48 crc kubenswrapper[4763]: E0930 14:54:48.736938 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5427eafc-b753-4e2f-92bc-b0f4ccbd357c" containerName="dnsmasq-dns" Sep 30 14:54:48 crc kubenswrapper[4763]: I0930 14:54:48.736944 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5427eafc-b753-4e2f-92bc-b0f4ccbd357c" containerName="dnsmasq-dns" Sep 30 14:54:48 crc kubenswrapper[4763]: I0930 14:54:48.737106 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5427eafc-b753-4e2f-92bc-b0f4ccbd357c" containerName="dnsmasq-dns" Sep 30 14:54:48 crc kubenswrapper[4763]: I0930 14:54:48.737585 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Sep 30 14:54:48 crc kubenswrapper[4763]: I0930 14:54:48.745795 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-rzn4l" Sep 30 14:54:48 crc kubenswrapper[4763]: I0930 14:54:48.762638 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Sep 30 14:54:48 crc kubenswrapper[4763]: I0930 14:54:48.848328 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmpl6\" (UniqueName: \"kubernetes.io/projected/30e72af6-44f8-4305-83c1-4f89522b56f7-kube-api-access-tmpl6\") pod \"mariadb-client-1-default\" (UID: \"30e72af6-44f8-4305-83c1-4f89522b56f7\") " pod="openstack/mariadb-client-1-default" Sep 30 14:54:48 crc kubenswrapper[4763]: I0930 14:54:48.949222 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmpl6\" (UniqueName: \"kubernetes.io/projected/30e72af6-44f8-4305-83c1-4f89522b56f7-kube-api-access-tmpl6\") pod \"mariadb-client-1-default\" (UID: \"30e72af6-44f8-4305-83c1-4f89522b56f7\") " pod="openstack/mariadb-client-1-default" Sep 30 14:54:48 crc kubenswrapper[4763]: I0930 14:54:48.967326 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmpl6\" (UniqueName: \"kubernetes.io/projected/30e72af6-44f8-4305-83c1-4f89522b56f7-kube-api-access-tmpl6\") pod \"mariadb-client-1-default\" (UID: \"30e72af6-44f8-4305-83c1-4f89522b56f7\") " pod="openstack/mariadb-client-1-default" Sep 30 14:54:49 crc kubenswrapper[4763]: I0930 14:54:49.059192 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Sep 30 14:54:49 crc kubenswrapper[4763]: I0930 14:54:49.516362 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Sep 30 14:54:49 crc kubenswrapper[4763]: W0930 14:54:49.519018 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30e72af6_44f8_4305_83c1_4f89522b56f7.slice/crio-2e43789269e167d7aeb1f4de4a61b0f39f501fa288dd95197373778bf5b7756e WatchSource:0}: Error finding container 2e43789269e167d7aeb1f4de4a61b0f39f501fa288dd95197373778bf5b7756e: Status 404 returned error can't find the container with id 2e43789269e167d7aeb1f4de4a61b0f39f501fa288dd95197373778bf5b7756e Sep 30 14:54:50 crc kubenswrapper[4763]: I0930 14:54:50.016705 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"30e72af6-44f8-4305-83c1-4f89522b56f7","Type":"ContainerStarted","Data":"2e43789269e167d7aeb1f4de4a61b0f39f501fa288dd95197373778bf5b7756e"} Sep 30 14:54:51 crc kubenswrapper[4763]: I0930 14:54:51.024744 4763 generic.go:334] "Generic (PLEG): container finished" podID="30e72af6-44f8-4305-83c1-4f89522b56f7" containerID="e065e5acbaaa623a8c7f093d617647bf9cce0e7f959da54c1163c5fe6f05d147" exitCode=0 Sep 30 14:54:51 crc kubenswrapper[4763]: I0930 14:54:51.024846 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"30e72af6-44f8-4305-83c1-4f89522b56f7","Type":"ContainerDied","Data":"e065e5acbaaa623a8c7f093d617647bf9cce0e7f959da54c1163c5fe6f05d147"} Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.441982 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.467173 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_30e72af6-44f8-4305-83c1-4f89522b56f7/mariadb-client-1-default/0.log" Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.501140 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.501175 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.558995 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmpl6\" (UniqueName: \"kubernetes.io/projected/30e72af6-44f8-4305-83c1-4f89522b56f7-kube-api-access-tmpl6\") pod \"30e72af6-44f8-4305-83c1-4f89522b56f7\" (UID: \"30e72af6-44f8-4305-83c1-4f89522b56f7\") " Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.563689 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e72af6-44f8-4305-83c1-4f89522b56f7-kube-api-access-tmpl6" (OuterVolumeSpecName: "kube-api-access-tmpl6") pod "30e72af6-44f8-4305-83c1-4f89522b56f7" (UID: "30e72af6-44f8-4305-83c1-4f89522b56f7"). InnerVolumeSpecName "kube-api-access-tmpl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.661325 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmpl6\" (UniqueName: \"kubernetes.io/projected/30e72af6-44f8-4305-83c1-4f89522b56f7-kube-api-access-tmpl6\") on node \"crc\" DevicePath \"\"" Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.933258 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Sep 30 14:54:52 crc kubenswrapper[4763]: E0930 14:54:52.933662 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e72af6-44f8-4305-83c1-4f89522b56f7" containerName="mariadb-client-1-default" Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.933679 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e72af6-44f8-4305-83c1-4f89522b56f7" containerName="mariadb-client-1-default" Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.933808 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e72af6-44f8-4305-83c1-4f89522b56f7" containerName="mariadb-client-1-default" Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.934319 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Sep 30 14:54:52 crc kubenswrapper[4763]: I0930 14:54:52.941135 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Sep 30 14:54:53 crc kubenswrapper[4763]: I0930 14:54:53.041590 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e43789269e167d7aeb1f4de4a61b0f39f501fa288dd95197373778bf5b7756e" Sep 30 14:54:53 crc kubenswrapper[4763]: I0930 14:54:53.041652 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Sep 30 14:54:53 crc kubenswrapper[4763]: I0930 14:54:53.068120 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8x5t\" (UniqueName: \"kubernetes.io/projected/126ba8e6-a2dd-4f92-bd80-6901649e1c44-kube-api-access-l8x5t\") pod \"mariadb-client-2-default\" (UID: \"126ba8e6-a2dd-4f92-bd80-6901649e1c44\") " pod="openstack/mariadb-client-2-default" Sep 30 14:54:53 crc kubenswrapper[4763]: I0930 14:54:53.169938 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8x5t\" (UniqueName: \"kubernetes.io/projected/126ba8e6-a2dd-4f92-bd80-6901649e1c44-kube-api-access-l8x5t\") pod \"mariadb-client-2-default\" (UID: \"126ba8e6-a2dd-4f92-bd80-6901649e1c44\") " pod="openstack/mariadb-client-2-default" Sep 30 14:54:53 crc kubenswrapper[4763]: I0930 14:54:53.188151 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8x5t\" (UniqueName: \"kubernetes.io/projected/126ba8e6-a2dd-4f92-bd80-6901649e1c44-kube-api-access-l8x5t\") pod \"mariadb-client-2-default\" (UID: \"126ba8e6-a2dd-4f92-bd80-6901649e1c44\") " pod="openstack/mariadb-client-2-default" Sep 30 14:54:53 crc kubenswrapper[4763]: I0930 14:54:53.260382 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Sep 30 14:54:53 crc kubenswrapper[4763]: W0930 14:54:53.785749 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod126ba8e6_a2dd_4f92_bd80_6901649e1c44.slice/crio-ed0aac552cac9352d7bba81ad618a6d019b42e59033d11aa98f011688551e089 WatchSource:0}: Error finding container ed0aac552cac9352d7bba81ad618a6d019b42e59033d11aa98f011688551e089: Status 404 returned error can't find the container with id ed0aac552cac9352d7bba81ad618a6d019b42e59033d11aa98f011688551e089 Sep 30 14:54:53 crc kubenswrapper[4763]: I0930 14:54:53.788332 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Sep 30 14:54:54 crc kubenswrapper[4763]: I0930 14:54:54.052609 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"126ba8e6-a2dd-4f92-bd80-6901649e1c44","Type":"ContainerStarted","Data":"717f32c0a5c4d982d61e6442c9721797e5e25a268d90bb745feeeac4223230ca"} Sep 30 14:54:54 crc kubenswrapper[4763]: I0930 14:54:54.052657 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"126ba8e6-a2dd-4f92-bd80-6901649e1c44","Type":"ContainerStarted","Data":"ed0aac552cac9352d7bba81ad618a6d019b42e59033d11aa98f011688551e089"} Sep 30 14:54:54 crc kubenswrapper[4763]: I0930 14:54:54.072315 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=2.07228691 podStartE2EDuration="2.07228691s" podCreationTimestamp="2025-09-30 14:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:54:54.065128972 +0000 UTC m=+4766.203689277" watchObservedRunningTime="2025-09-30 14:54:54.07228691 +0000 UTC m=+4766.210847215" Sep 30 14:54:54 crc kubenswrapper[4763]: I0930 14:54:54.499095 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e72af6-44f8-4305-83c1-4f89522b56f7" path="/var/lib/kubelet/pods/30e72af6-44f8-4305-83c1-4f89522b56f7/volumes" Sep 30 14:54:55 crc kubenswrapper[4763]: I0930 14:54:55.059391 4763 generic.go:334] "Generic (PLEG): container finished" podID="126ba8e6-a2dd-4f92-bd80-6901649e1c44" containerID="717f32c0a5c4d982d61e6442c9721797e5e25a268d90bb745feeeac4223230ca" exitCode=0 Sep 30 14:54:55 crc kubenswrapper[4763]: I0930 14:54:55.060164 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"126ba8e6-a2dd-4f92-bd80-6901649e1c44","Type":"ContainerDied","Data":"717f32c0a5c4d982d61e6442c9721797e5e25a268d90bb745feeeac4223230ca"} Sep 30 14:54:56 crc kubenswrapper[4763]: I0930 14:54:56.381193 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Sep 30 14:54:56 crc kubenswrapper[4763]: I0930 14:54:56.413846 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Sep 30 14:54:56 crc kubenswrapper[4763]: I0930 14:54:56.420246 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Sep 30 14:54:56 crc kubenswrapper[4763]: I0930 14:54:56.519544 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8x5t\" (UniqueName: \"kubernetes.io/projected/126ba8e6-a2dd-4f92-bd80-6901649e1c44-kube-api-access-l8x5t\") pod \"126ba8e6-a2dd-4f92-bd80-6901649e1c44\" (UID: \"126ba8e6-a2dd-4f92-bd80-6901649e1c44\") " Sep 30 14:54:56 crc kubenswrapper[4763]: I0930 14:54:56.524163 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126ba8e6-a2dd-4f92-bd80-6901649e1c44-kube-api-access-l8x5t" (OuterVolumeSpecName: "kube-api-access-l8x5t") pod "126ba8e6-a2dd-4f92-bd80-6901649e1c44" (UID: "126ba8e6-a2dd-4f92-bd80-6901649e1c44"). InnerVolumeSpecName "kube-api-access-l8x5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:54:56 crc kubenswrapper[4763]: I0930 14:54:56.621631 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8x5t\" (UniqueName: \"kubernetes.io/projected/126ba8e6-a2dd-4f92-bd80-6901649e1c44-kube-api-access-l8x5t\") on node \"crc\" DevicePath \"\"" Sep 30 14:54:56 crc kubenswrapper[4763]: I0930 14:54:56.914017 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Sep 30 14:54:56 crc kubenswrapper[4763]: E0930 14:54:56.914411 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126ba8e6-a2dd-4f92-bd80-6901649e1c44" containerName="mariadb-client-2-default" Sep 30 14:54:56 crc kubenswrapper[4763]: I0930 14:54:56.914430 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="126ba8e6-a2dd-4f92-bd80-6901649e1c44" containerName="mariadb-client-2-default" Sep 30 14:54:56 crc kubenswrapper[4763]: I0930 14:54:56.914693 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="126ba8e6-a2dd-4f92-bd80-6901649e1c44" containerName="mariadb-client-2-default" Sep 30 14:54:56 crc kubenswrapper[4763]: I0930 14:54:56.915678 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Sep 30 14:54:56 crc kubenswrapper[4763]: I0930 14:54:56.922316 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Sep 30 14:54:57 crc kubenswrapper[4763]: I0930 14:54:57.028252 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcv7d\" (UniqueName: \"kubernetes.io/projected/2e118158-5dd8-4cf0-a833-bff15d2eefe8-kube-api-access-mcv7d\") pod \"mariadb-client-1\" (UID: \"2e118158-5dd8-4cf0-a833-bff15d2eefe8\") " pod="openstack/mariadb-client-1" Sep 30 14:54:57 crc kubenswrapper[4763]: I0930 14:54:57.074649 4763 scope.go:117] "RemoveContainer" containerID="717f32c0a5c4d982d61e6442c9721797e5e25a268d90bb745feeeac4223230ca" Sep 30 14:54:57 crc kubenswrapper[4763]: I0930 14:54:57.074748 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Sep 30 14:54:57 crc kubenswrapper[4763]: I0930 14:54:57.130202 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcv7d\" (UniqueName: \"kubernetes.io/projected/2e118158-5dd8-4cf0-a833-bff15d2eefe8-kube-api-access-mcv7d\") pod \"mariadb-client-1\" (UID: \"2e118158-5dd8-4cf0-a833-bff15d2eefe8\") " pod="openstack/mariadb-client-1" Sep 30 14:54:57 crc kubenswrapper[4763]: I0930 14:54:57.147712 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcv7d\" (UniqueName: \"kubernetes.io/projected/2e118158-5dd8-4cf0-a833-bff15d2eefe8-kube-api-access-mcv7d\") pod \"mariadb-client-1\" (UID: \"2e118158-5dd8-4cf0-a833-bff15d2eefe8\") " pod="openstack/mariadb-client-1" Sep 30 14:54:57 crc kubenswrapper[4763]: I0930 14:54:57.233398 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Sep 30 14:54:57 crc kubenswrapper[4763]: I0930 14:54:57.711116 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Sep 30 14:54:58 crc kubenswrapper[4763]: I0930 14:54:58.088129 4763 generic.go:334] "Generic (PLEG): container finished" podID="2e118158-5dd8-4cf0-a833-bff15d2eefe8" containerID="9b928cce12141f53b983ad55837a56b815109996c9286159819152492255b8a8" exitCode=0 Sep 30 14:54:58 crc kubenswrapper[4763]: I0930 14:54:58.088183 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2e118158-5dd8-4cf0-a833-bff15d2eefe8","Type":"ContainerDied","Data":"9b928cce12141f53b983ad55837a56b815109996c9286159819152492255b8a8"} Sep 30 14:54:58 crc kubenswrapper[4763]: I0930 14:54:58.088425 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2e118158-5dd8-4cf0-a833-bff15d2eefe8","Type":"ContainerStarted","Data":"f4f8bb59826527184195f65aaeb34c4c75eeedb70258374d6c67118558913254"} Sep 30 14:54:58 crc kubenswrapper[4763]: I0930 14:54:58.501811 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126ba8e6-a2dd-4f92-bd80-6901649e1c44" path="/var/lib/kubelet/pods/126ba8e6-a2dd-4f92-bd80-6901649e1c44/volumes" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.089533 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.112698 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2e118158-5dd8-4cf0-a833-bff15d2eefe8","Type":"ContainerDied","Data":"f4f8bb59826527184195f65aaeb34c4c75eeedb70258374d6c67118558913254"} Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.112988 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4f8bb59826527184195f65aaeb34c4c75eeedb70258374d6c67118558913254" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.112849 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.112763 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_2e118158-5dd8-4cf0-a833-bff15d2eefe8/mariadb-client-1/0.log" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.142853 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.148660 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.175439 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcv7d\" (UniqueName: \"kubernetes.io/projected/2e118158-5dd8-4cf0-a833-bff15d2eefe8-kube-api-access-mcv7d\") pod \"2e118158-5dd8-4cf0-a833-bff15d2eefe8\" (UID: \"2e118158-5dd8-4cf0-a833-bff15d2eefe8\") " Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.188805 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e118158-5dd8-4cf0-a833-bff15d2eefe8-kube-api-access-mcv7d" (OuterVolumeSpecName: "kube-api-access-mcv7d") pod "2e118158-5dd8-4cf0-a833-bff15d2eefe8" (UID: "2e118158-5dd8-4cf0-a833-bff15d2eefe8"). InnerVolumeSpecName "kube-api-access-mcv7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.277665 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcv7d\" (UniqueName: \"kubernetes.io/projected/2e118158-5dd8-4cf0-a833-bff15d2eefe8-kube-api-access-mcv7d\") on node \"crc\" DevicePath \"\"" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.499095 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e118158-5dd8-4cf0-a833-bff15d2eefe8" path="/var/lib/kubelet/pods/2e118158-5dd8-4cf0-a833-bff15d2eefe8/volumes" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.618765 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Sep 30 14:55:00 crc kubenswrapper[4763]: E0930 14:55:00.619193 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e118158-5dd8-4cf0-a833-bff15d2eefe8" containerName="mariadb-client-1" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.619218 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e118158-5dd8-4cf0-a833-bff15d2eefe8" containerName="mariadb-client-1" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.619513 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e118158-5dd8-4cf0-a833-bff15d2eefe8" containerName="mariadb-client-1" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.620257 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.623943 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-rzn4l" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.628277 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.783586 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzwhb\" (UniqueName: \"kubernetes.io/projected/18b5db4f-ad1f-4f6b-a24f-0079126bb51e-kube-api-access-kzwhb\") pod \"mariadb-client-4-default\" (UID: \"18b5db4f-ad1f-4f6b-a24f-0079126bb51e\") " pod="openstack/mariadb-client-4-default" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.884883 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzwhb\" (UniqueName: \"kubernetes.io/projected/18b5db4f-ad1f-4f6b-a24f-0079126bb51e-kube-api-access-kzwhb\") pod \"mariadb-client-4-default\" (UID: \"18b5db4f-ad1f-4f6b-a24f-0079126bb51e\") " pod="openstack/mariadb-client-4-default" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.902400 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzwhb\" (UniqueName: \"kubernetes.io/projected/18b5db4f-ad1f-4f6b-a24f-0079126bb51e-kube-api-access-kzwhb\") pod \"mariadb-client-4-default\" (UID: \"18b5db4f-ad1f-4f6b-a24f-0079126bb51e\") " pod="openstack/mariadb-client-4-default" Sep 30 14:55:00 crc kubenswrapper[4763]: I0930 14:55:00.941454 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Sep 30 14:55:01 crc kubenswrapper[4763]: I0930 14:55:01.400988 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Sep 30 14:55:01 crc kubenswrapper[4763]: W0930 14:55:01.405715 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b5db4f_ad1f_4f6b_a24f_0079126bb51e.slice/crio-565956c8922348dbb8772354353b65a999f36d961347109c626bb67edb15e77e WatchSource:0}: Error finding container 565956c8922348dbb8772354353b65a999f36d961347109c626bb67edb15e77e: Status 404 returned error can't find the container with id 565956c8922348dbb8772354353b65a999f36d961347109c626bb67edb15e77e Sep 30 14:55:02 crc kubenswrapper[4763]: I0930 14:55:02.149039 4763 generic.go:334] "Generic (PLEG): container finished" podID="18b5db4f-ad1f-4f6b-a24f-0079126bb51e" containerID="f88b90e5d60bd578d52a548ee890b8d8c8a00a4f9c0f6cf2f86d456a1c0eef2c" exitCode=0 Sep 30 14:55:02 crc kubenswrapper[4763]: I0930 14:55:02.149116 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"18b5db4f-ad1f-4f6b-a24f-0079126bb51e","Type":"ContainerDied","Data":"f88b90e5d60bd578d52a548ee890b8d8c8a00a4f9c0f6cf2f86d456a1c0eef2c"} Sep 30 14:55:02 crc kubenswrapper[4763]: I0930 14:55:02.149866 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"18b5db4f-ad1f-4f6b-a24f-0079126bb51e","Type":"ContainerStarted","Data":"565956c8922348dbb8772354353b65a999f36d961347109c626bb67edb15e77e"} Sep 30 14:55:03 crc kubenswrapper[4763]: I0930 14:55:03.498170 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Sep 30 14:55:03 crc kubenswrapper[4763]: I0930 14:55:03.522286 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_18b5db4f-ad1f-4f6b-a24f-0079126bb51e/mariadb-client-4-default/0.log" Sep 30 14:55:03 crc kubenswrapper[4763]: I0930 14:55:03.544177 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Sep 30 14:55:03 crc kubenswrapper[4763]: I0930 14:55:03.550032 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Sep 30 14:55:03 crc kubenswrapper[4763]: I0930 14:55:03.639790 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzwhb\" (UniqueName: \"kubernetes.io/projected/18b5db4f-ad1f-4f6b-a24f-0079126bb51e-kube-api-access-kzwhb\") pod \"18b5db4f-ad1f-4f6b-a24f-0079126bb51e\" (UID: \"18b5db4f-ad1f-4f6b-a24f-0079126bb51e\") " Sep 30 14:55:03 crc kubenswrapper[4763]: I0930 14:55:03.653864 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b5db4f-ad1f-4f6b-a24f-0079126bb51e-kube-api-access-kzwhb" (OuterVolumeSpecName: "kube-api-access-kzwhb") pod "18b5db4f-ad1f-4f6b-a24f-0079126bb51e" (UID: "18b5db4f-ad1f-4f6b-a24f-0079126bb51e"). InnerVolumeSpecName "kube-api-access-kzwhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:55:03 crc kubenswrapper[4763]: I0930 14:55:03.741795 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzwhb\" (UniqueName: \"kubernetes.io/projected/18b5db4f-ad1f-4f6b-a24f-0079126bb51e-kube-api-access-kzwhb\") on node \"crc\" DevicePath \"\"" Sep 30 14:55:04 crc kubenswrapper[4763]: I0930 14:55:04.165824 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="565956c8922348dbb8772354353b65a999f36d961347109c626bb67edb15e77e" Sep 30 14:55:04 crc kubenswrapper[4763]: I0930 14:55:04.165930 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Sep 30 14:55:04 crc kubenswrapper[4763]: I0930 14:55:04.506722 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b5db4f-ad1f-4f6b-a24f-0079126bb51e" path="/var/lib/kubelet/pods/18b5db4f-ad1f-4f6b-a24f-0079126bb51e/volumes" Sep 30 14:55:07 crc kubenswrapper[4763]: I0930 14:55:07.784628 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Sep 30 14:55:07 crc kubenswrapper[4763]: E0930 14:55:07.787151 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b5db4f-ad1f-4f6b-a24f-0079126bb51e" containerName="mariadb-client-4-default" Sep 30 14:55:07 crc kubenswrapper[4763]: I0930 14:55:07.787189 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b5db4f-ad1f-4f6b-a24f-0079126bb51e" containerName="mariadb-client-4-default" Sep 30 14:55:07 crc kubenswrapper[4763]: I0930 14:55:07.787405 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b5db4f-ad1f-4f6b-a24f-0079126bb51e" containerName="mariadb-client-4-default" Sep 30 14:55:07 crc kubenswrapper[4763]: I0930 14:55:07.788179 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Sep 30 14:55:07 crc kubenswrapper[4763]: I0930 14:55:07.790790 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Sep 30 14:55:07 crc kubenswrapper[4763]: I0930 14:55:07.791077 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-rzn4l" Sep 30 14:55:07 crc kubenswrapper[4763]: I0930 14:55:07.908016 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28qdz\" (UniqueName: \"kubernetes.io/projected/8084834c-5b04-43d1-a537-48467af4388b-kube-api-access-28qdz\") pod \"mariadb-client-5-default\" (UID: \"8084834c-5b04-43d1-a537-48467af4388b\") " pod="openstack/mariadb-client-5-default" Sep 30 14:55:08 crc kubenswrapper[4763]: I0930 14:55:08.009293 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28qdz\" (UniqueName: \"kubernetes.io/projected/8084834c-5b04-43d1-a537-48467af4388b-kube-api-access-28qdz\") pod \"mariadb-client-5-default\" (UID: \"8084834c-5b04-43d1-a537-48467af4388b\") " pod="openstack/mariadb-client-5-default" Sep 30 14:55:08 crc kubenswrapper[4763]: I0930 14:55:08.053919 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28qdz\" (UniqueName: \"kubernetes.io/projected/8084834c-5b04-43d1-a537-48467af4388b-kube-api-access-28qdz\") pod \"mariadb-client-5-default\" (UID: \"8084834c-5b04-43d1-a537-48467af4388b\") " pod="openstack/mariadb-client-5-default" Sep 30 14:55:08 crc kubenswrapper[4763]: I0930 14:55:08.108948 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Sep 30 14:55:08 crc kubenswrapper[4763]: I0930 14:55:08.634710 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Sep 30 14:55:09 crc kubenswrapper[4763]: I0930 14:55:09.203542 4763 generic.go:334] "Generic (PLEG): container finished" podID="8084834c-5b04-43d1-a537-48467af4388b" containerID="80c354a53c4ebc7bf80ad72e734b8cd2bf43bbb044abc71797ed3da2ee1e6448" exitCode=0 Sep 30 14:55:09 crc kubenswrapper[4763]: I0930 14:55:09.203585 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"8084834c-5b04-43d1-a537-48467af4388b","Type":"ContainerDied","Data":"80c354a53c4ebc7bf80ad72e734b8cd2bf43bbb044abc71797ed3da2ee1e6448"} Sep 30 14:55:09 crc kubenswrapper[4763]: I0930 14:55:09.203642 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"8084834c-5b04-43d1-a537-48467af4388b","Type":"ContainerStarted","Data":"b7983039315ee1014392ff6f46f00e866a62ddaa0f2762cb110db2f9473d23b9"} Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.532109 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.556091 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_8084834c-5b04-43d1-a537-48467af4388b/mariadb-client-5-default/0.log" Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.587619 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.597315 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.648046 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28qdz\" (UniqueName: \"kubernetes.io/projected/8084834c-5b04-43d1-a537-48467af4388b-kube-api-access-28qdz\") pod \"8084834c-5b04-43d1-a537-48467af4388b\" (UID: \"8084834c-5b04-43d1-a537-48467af4388b\") " Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.656879 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8084834c-5b04-43d1-a537-48467af4388b-kube-api-access-28qdz" (OuterVolumeSpecName: "kube-api-access-28qdz") pod "8084834c-5b04-43d1-a537-48467af4388b" (UID: "8084834c-5b04-43d1-a537-48467af4388b"). InnerVolumeSpecName "kube-api-access-28qdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.711786 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Sep 30 14:55:10 crc kubenswrapper[4763]: E0930 14:55:10.712157 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8084834c-5b04-43d1-a537-48467af4388b" containerName="mariadb-client-5-default" Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.712175 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8084834c-5b04-43d1-a537-48467af4388b" containerName="mariadb-client-5-default" Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.712350 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8084834c-5b04-43d1-a537-48467af4388b" containerName="mariadb-client-5-default" Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.712874 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.719494 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.751014 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28qdz\" (UniqueName: \"kubernetes.io/projected/8084834c-5b04-43d1-a537-48467af4388b-kube-api-access-28qdz\") on node \"crc\" DevicePath \"\"" Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.852247 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x67fm\" (UniqueName: \"kubernetes.io/projected/0ce7ef54-c2d2-4da3-b42e-982303adae79-kube-api-access-x67fm\") pod \"mariadb-client-6-default\" (UID: \"0ce7ef54-c2d2-4da3-b42e-982303adae79\") " pod="openstack/mariadb-client-6-default" Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.954041 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x67fm\" (UniqueName: \"kubernetes.io/projected/0ce7ef54-c2d2-4da3-b42e-982303adae79-kube-api-access-x67fm\") pod \"mariadb-client-6-default\" (UID: \"0ce7ef54-c2d2-4da3-b42e-982303adae79\") " pod="openstack/mariadb-client-6-default" Sep 30 14:55:10 crc kubenswrapper[4763]: I0930 14:55:10.974254 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x67fm\" (UniqueName: \"kubernetes.io/projected/0ce7ef54-c2d2-4da3-b42e-982303adae79-kube-api-access-x67fm\") pod \"mariadb-client-6-default\" (UID: \"0ce7ef54-c2d2-4da3-b42e-982303adae79\") " pod="openstack/mariadb-client-6-default" Sep 30 14:55:11 crc kubenswrapper[4763]: I0930 14:55:11.032064 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Sep 30 14:55:11 crc kubenswrapper[4763]: I0930 14:55:11.222781 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7983039315ee1014392ff6f46f00e866a62ddaa0f2762cb110db2f9473d23b9" Sep 30 14:55:11 crc kubenswrapper[4763]: I0930 14:55:11.222851 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Sep 30 14:55:11 crc kubenswrapper[4763]: I0930 14:55:11.292328 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Sep 30 14:55:11 crc kubenswrapper[4763]: W0930 14:55:11.298040 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ce7ef54_c2d2_4da3_b42e_982303adae79.slice/crio-457b9ee1bd06e344230ae0e7b8f89e9d50ce7b710a24c6966b0ecb2ec3cb101a WatchSource:0}: Error finding container 457b9ee1bd06e344230ae0e7b8f89e9d50ce7b710a24c6966b0ecb2ec3cb101a: Status 404 returned error can't find the container with id 457b9ee1bd06e344230ae0e7b8f89e9d50ce7b710a24c6966b0ecb2ec3cb101a Sep 30 14:55:12 crc kubenswrapper[4763]: I0930 14:55:12.234461 4763 generic.go:334] "Generic (PLEG): container finished" podID="0ce7ef54-c2d2-4da3-b42e-982303adae79" containerID="f60c3d8bd26c0408138a15217c0cfc50b613532deff50869d04851947ddaa60f" exitCode=0 Sep 30 14:55:12 crc kubenswrapper[4763]: I0930 14:55:12.234516 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"0ce7ef54-c2d2-4da3-b42e-982303adae79","Type":"ContainerDied","Data":"f60c3d8bd26c0408138a15217c0cfc50b613532deff50869d04851947ddaa60f"} Sep 30 14:55:12 crc kubenswrapper[4763]: I0930 14:55:12.234551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"0ce7ef54-c2d2-4da3-b42e-982303adae79","Type":"ContainerStarted","Data":"457b9ee1bd06e344230ae0e7b8f89e9d50ce7b710a24c6966b0ecb2ec3cb101a"} Sep 30 14:55:12 crc kubenswrapper[4763]: I0930 14:55:12.502982 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8084834c-5b04-43d1-a537-48467af4388b" path="/var/lib/kubelet/pods/8084834c-5b04-43d1-a537-48467af4388b/volumes" Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.603797 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.642705 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_0ce7ef54-c2d2-4da3-b42e-982303adae79/mariadb-client-6-default/0.log" Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.668918 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.674196 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.694692 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x67fm\" (UniqueName: \"kubernetes.io/projected/0ce7ef54-c2d2-4da3-b42e-982303adae79-kube-api-access-x67fm\") pod \"0ce7ef54-c2d2-4da3-b42e-982303adae79\" (UID: \"0ce7ef54-c2d2-4da3-b42e-982303adae79\") " Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.699738 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce7ef54-c2d2-4da3-b42e-982303adae79-kube-api-access-x67fm" (OuterVolumeSpecName: "kube-api-access-x67fm") pod "0ce7ef54-c2d2-4da3-b42e-982303adae79" (UID: "0ce7ef54-c2d2-4da3-b42e-982303adae79"). InnerVolumeSpecName "kube-api-access-x67fm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.796260 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x67fm\" (UniqueName: \"kubernetes.io/projected/0ce7ef54-c2d2-4da3-b42e-982303adae79-kube-api-access-x67fm\") on node \"crc\" DevicePath \"\"" Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.827145 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Sep 30 14:55:13 crc kubenswrapper[4763]: E0930 14:55:13.827739 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce7ef54-c2d2-4da3-b42e-982303adae79" containerName="mariadb-client-6-default" Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.827828 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce7ef54-c2d2-4da3-b42e-982303adae79" containerName="mariadb-client-6-default" Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.828083 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce7ef54-c2d2-4da3-b42e-982303adae79" containerName="mariadb-client-6-default" Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.828794 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.832567 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Sep 30 14:55:13 crc kubenswrapper[4763]: I0930 14:55:13.898825 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdshf\" (UniqueName: \"kubernetes.io/projected/7e300eb5-22f4-40ad-a5e4-df0c9cefeda4-kube-api-access-xdshf\") pod \"mariadb-client-7-default\" (UID: \"7e300eb5-22f4-40ad-a5e4-df0c9cefeda4\") " pod="openstack/mariadb-client-7-default" Sep 30 14:55:14 crc kubenswrapper[4763]: I0930 14:55:14.000562 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdshf\" (UniqueName: \"kubernetes.io/projected/7e300eb5-22f4-40ad-a5e4-df0c9cefeda4-kube-api-access-xdshf\") pod \"mariadb-client-7-default\" (UID: \"7e300eb5-22f4-40ad-a5e4-df0c9cefeda4\") " pod="openstack/mariadb-client-7-default" Sep 30 14:55:14 crc kubenswrapper[4763]: I0930 14:55:14.015785 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdshf\" (UniqueName: \"kubernetes.io/projected/7e300eb5-22f4-40ad-a5e4-df0c9cefeda4-kube-api-access-xdshf\") pod \"mariadb-client-7-default\" (UID: \"7e300eb5-22f4-40ad-a5e4-df0c9cefeda4\") " pod="openstack/mariadb-client-7-default" Sep 30 14:55:14 crc kubenswrapper[4763]: I0930 14:55:14.152371 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Sep 30 14:55:14 crc kubenswrapper[4763]: I0930 14:55:14.261650 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="457b9ee1bd06e344230ae0e7b8f89e9d50ce7b710a24c6966b0ecb2ec3cb101a" Sep 30 14:55:14 crc kubenswrapper[4763]: I0930 14:55:14.261733 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Sep 30 14:55:14 crc kubenswrapper[4763]: I0930 14:55:14.500690 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce7ef54-c2d2-4da3-b42e-982303adae79" path="/var/lib/kubelet/pods/0ce7ef54-c2d2-4da3-b42e-982303adae79/volumes" Sep 30 14:55:14 crc kubenswrapper[4763]: I0930 14:55:14.652794 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Sep 30 14:55:14 crc kubenswrapper[4763]: W0930 14:55:14.655876 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e300eb5_22f4_40ad_a5e4_df0c9cefeda4.slice/crio-7bf4b614a2e3fc05984a37815603dd3990e61a9d6ce203d387c35aaf9ab41ce4 WatchSource:0}: Error finding container 7bf4b614a2e3fc05984a37815603dd3990e61a9d6ce203d387c35aaf9ab41ce4: Status 404 returned error can't find the container with id 7bf4b614a2e3fc05984a37815603dd3990e61a9d6ce203d387c35aaf9ab41ce4 Sep 30 14:55:15 crc kubenswrapper[4763]: I0930 14:55:15.275484 4763 generic.go:334] "Generic (PLEG): container finished" podID="7e300eb5-22f4-40ad-a5e4-df0c9cefeda4" containerID="e2b6dc7aa26f01802750fd17600bee09fcc81abea89768200dd79a5dea3a3b62" exitCode=0 Sep 30 14:55:15 crc kubenswrapper[4763]: I0930 14:55:15.275560 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"7e300eb5-22f4-40ad-a5e4-df0c9cefeda4","Type":"ContainerDied","Data":"e2b6dc7aa26f01802750fd17600bee09fcc81abea89768200dd79a5dea3a3b62"} Sep 30 14:55:15 crc kubenswrapper[4763]: I0930 14:55:15.275676 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"7e300eb5-22f4-40ad-a5e4-df0c9cefeda4","Type":"ContainerStarted","Data":"7bf4b614a2e3fc05984a37815603dd3990e61a9d6ce203d387c35aaf9ab41ce4"} Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.696267 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.712505 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_7e300eb5-22f4-40ad-a5e4-df0c9cefeda4/mariadb-client-7-default/0.log" Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.734850 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.739172 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.844009 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdshf\" (UniqueName: \"kubernetes.io/projected/7e300eb5-22f4-40ad-a5e4-df0c9cefeda4-kube-api-access-xdshf\") pod \"7e300eb5-22f4-40ad-a5e4-df0c9cefeda4\" (UID: \"7e300eb5-22f4-40ad-a5e4-df0c9cefeda4\") " Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.853866 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e300eb5-22f4-40ad-a5e4-df0c9cefeda4-kube-api-access-xdshf" (OuterVolumeSpecName: "kube-api-access-xdshf") pod "7e300eb5-22f4-40ad-a5e4-df0c9cefeda4" (UID: "7e300eb5-22f4-40ad-a5e4-df0c9cefeda4"). InnerVolumeSpecName "kube-api-access-xdshf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.861365 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Sep 30 14:55:16 crc kubenswrapper[4763]: E0930 14:55:16.861810 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e300eb5-22f4-40ad-a5e4-df0c9cefeda4" containerName="mariadb-client-7-default" Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.861840 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e300eb5-22f4-40ad-a5e4-df0c9cefeda4" containerName="mariadb-client-7-default" Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.862172 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e300eb5-22f4-40ad-a5e4-df0c9cefeda4" containerName="mariadb-client-7-default" Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.863067 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.873876 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.946074 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62dkh\" (UniqueName: \"kubernetes.io/projected/47a89a4c-060f-4efd-8380-9fede3606e3d-kube-api-access-62dkh\") pod \"mariadb-client-2\" (UID: \"47a89a4c-060f-4efd-8380-9fede3606e3d\") " pod="openstack/mariadb-client-2" Sep 30 14:55:16 crc kubenswrapper[4763]: I0930 14:55:16.946283 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdshf\" (UniqueName: \"kubernetes.io/projected/7e300eb5-22f4-40ad-a5e4-df0c9cefeda4-kube-api-access-xdshf\") on node \"crc\" DevicePath \"\"" Sep 30 14:55:17 crc kubenswrapper[4763]: I0930 14:55:17.047407 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62dkh\" (UniqueName: \"kubernetes.io/projected/47a89a4c-060f-4efd-8380-9fede3606e3d-kube-api-access-62dkh\") pod \"mariadb-client-2\" (UID: \"47a89a4c-060f-4efd-8380-9fede3606e3d\") " pod="openstack/mariadb-client-2" Sep 30 14:55:17 crc kubenswrapper[4763]: I0930 14:55:17.064951 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62dkh\" (UniqueName: \"kubernetes.io/projected/47a89a4c-060f-4efd-8380-9fede3606e3d-kube-api-access-62dkh\") pod \"mariadb-client-2\" (UID: \"47a89a4c-060f-4efd-8380-9fede3606e3d\") " pod="openstack/mariadb-client-2" Sep 30 14:55:17 crc kubenswrapper[4763]: I0930 14:55:17.212852 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Sep 30 14:55:17 crc kubenswrapper[4763]: I0930 14:55:17.309961 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bf4b614a2e3fc05984a37815603dd3990e61a9d6ce203d387c35aaf9ab41ce4" Sep 30 14:55:17 crc kubenswrapper[4763]: I0930 14:55:17.310059 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Sep 30 14:55:17 crc kubenswrapper[4763]: I0930 14:55:17.685755 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Sep 30 14:55:17 crc kubenswrapper[4763]: W0930 14:55:17.686558 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47a89a4c_060f_4efd_8380_9fede3606e3d.slice/crio-fdb721c295d3910350eb2b98fe892f1c3e12ec40032bed5f14971b821e8e36de WatchSource:0}: Error finding container fdb721c295d3910350eb2b98fe892f1c3e12ec40032bed5f14971b821e8e36de: Status 404 returned error can't find the container with id fdb721c295d3910350eb2b98fe892f1c3e12ec40032bed5f14971b821e8e36de Sep 30 14:55:18 crc kubenswrapper[4763]: I0930 14:55:18.321730 4763 generic.go:334] "Generic (PLEG): container finished" podID="47a89a4c-060f-4efd-8380-9fede3606e3d" containerID="fbe318bf1311ce7556c6e21fea49f6e875dcfac157ac9c02f2449116aae6120a" exitCode=0 Sep 30 14:55:18 crc kubenswrapper[4763]: I0930 14:55:18.321781 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"47a89a4c-060f-4efd-8380-9fede3606e3d","Type":"ContainerDied","Data":"fbe318bf1311ce7556c6e21fea49f6e875dcfac157ac9c02f2449116aae6120a"} Sep 30 14:55:18 crc kubenswrapper[4763]: I0930 14:55:18.321807 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"47a89a4c-060f-4efd-8380-9fede3606e3d","Type":"ContainerStarted","Data":"fdb721c295d3910350eb2b98fe892f1c3e12ec40032bed5f14971b821e8e36de"} Sep 30 14:55:18 crc kubenswrapper[4763]: I0930 14:55:18.501402 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e300eb5-22f4-40ad-a5e4-df0c9cefeda4" path="/var/lib/kubelet/pods/7e300eb5-22f4-40ad-a5e4-df0c9cefeda4/volumes" Sep 30 14:55:19 crc kubenswrapper[4763]: I0930 14:55:19.712243 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Sep 30 14:55:19 crc kubenswrapper[4763]: I0930 14:55:19.781575 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_47a89a4c-060f-4efd-8380-9fede3606e3d/mariadb-client-2/0.log" Sep 30 14:55:19 crc kubenswrapper[4763]: I0930 14:55:19.792931 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62dkh\" (UniqueName: \"kubernetes.io/projected/47a89a4c-060f-4efd-8380-9fede3606e3d-kube-api-access-62dkh\") pod \"47a89a4c-060f-4efd-8380-9fede3606e3d\" (UID: \"47a89a4c-060f-4efd-8380-9fede3606e3d\") " Sep 30 14:55:19 crc kubenswrapper[4763]: I0930 14:55:19.810062 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a89a4c-060f-4efd-8380-9fede3606e3d-kube-api-access-62dkh" (OuterVolumeSpecName: "kube-api-access-62dkh") pod "47a89a4c-060f-4efd-8380-9fede3606e3d" (UID: "47a89a4c-060f-4efd-8380-9fede3606e3d"). InnerVolumeSpecName "kube-api-access-62dkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:55:19 crc kubenswrapper[4763]: I0930 14:55:19.816905 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Sep 30 14:55:19 crc kubenswrapper[4763]: I0930 14:55:19.821896 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Sep 30 14:55:19 crc kubenswrapper[4763]: I0930 14:55:19.894657 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62dkh\" (UniqueName: \"kubernetes.io/projected/47a89a4c-060f-4efd-8380-9fede3606e3d-kube-api-access-62dkh\") on node \"crc\" DevicePath \"\"" Sep 30 14:55:20 crc kubenswrapper[4763]: I0930 14:55:20.338872 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdb721c295d3910350eb2b98fe892f1c3e12ec40032bed5f14971b821e8e36de" Sep 30 14:55:20 crc kubenswrapper[4763]: I0930 14:55:20.338913 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Sep 30 14:55:20 crc kubenswrapper[4763]: I0930 14:55:20.499755 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47a89a4c-060f-4efd-8380-9fede3606e3d" path="/var/lib/kubelet/pods/47a89a4c-060f-4efd-8380-9fede3606e3d/volumes" Sep 30 14:55:34 crc kubenswrapper[4763]: I0930 14:55:34.097955 4763 scope.go:117] "RemoveContainer" containerID="917e9ea7f5ee9f6190661c5e05c014f6c2b24ae163c3eda2ec4717ee524783a3" Sep 30 14:56:36 crc kubenswrapper[4763]: I0930 14:56:36.059964 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:56:36 crc kubenswrapper[4763]: I0930 14:56:36.060409 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:57:06 crc kubenswrapper[4763]: I0930 14:57:06.060202 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:57:06 crc kubenswrapper[4763]: I0930 14:57:06.060814 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:57:17 crc kubenswrapper[4763]: I0930 14:57:17.767100 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-trrf6"] Sep 30 14:57:17 crc kubenswrapper[4763]: E0930 14:57:17.768102 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a89a4c-060f-4efd-8380-9fede3606e3d" containerName="mariadb-client-2" Sep 30 14:57:17 crc kubenswrapper[4763]: I0930 14:57:17.768120 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a89a4c-060f-4efd-8380-9fede3606e3d" containerName="mariadb-client-2" Sep 30 14:57:17 crc 
kubenswrapper[4763]: I0930 14:57:17.768338 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a89a4c-060f-4efd-8380-9fede3606e3d" containerName="mariadb-client-2" Sep 30 14:57:17 crc kubenswrapper[4763]: I0930 14:57:17.769694 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:17 crc kubenswrapper[4763]: I0930 14:57:17.776229 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-trrf6"] Sep 30 14:57:17 crc kubenswrapper[4763]: I0930 14:57:17.929215 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chzkn\" (UniqueName: \"kubernetes.io/projected/b9a03e08-83b7-4928-b892-02b35b33ab7b-kube-api-access-chzkn\") pod \"certified-operators-trrf6\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:17 crc kubenswrapper[4763]: I0930 14:57:17.929567 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-catalog-content\") pod \"certified-operators-trrf6\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:17 crc kubenswrapper[4763]: I0930 14:57:17.929586 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-utilities\") pod \"certified-operators-trrf6\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:18 crc kubenswrapper[4763]: I0930 14:57:18.030635 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chzkn\" (UniqueName: \"kubernetes.io/projected/b9a03e08-83b7-4928-b892-02b35b33ab7b-kube-api-access-chzkn\") pod \"certified-operators-trrf6\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:18 crc kubenswrapper[4763]: I0930 14:57:18.030698 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-catalog-content\") pod \"certified-operators-trrf6\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:18 crc kubenswrapper[4763]: I0930 14:57:18.030717 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-utilities\") pod \"certified-operators-trrf6\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:18 crc kubenswrapper[4763]: I0930 14:57:18.031344 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-catalog-content\") pod \"certified-operators-trrf6\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:18 crc kubenswrapper[4763]: I0930 14:57:18.031422 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-utilities\") pod \"certified-operators-trrf6\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:18 crc kubenswrapper[4763]: I0930 14:57:18.063981 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chzkn\" (UniqueName: \"kubernetes.io/projected/b9a03e08-83b7-4928-b892-02b35b33ab7b-kube-api-access-chzkn\") pod \"certified-operators-trrf6\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:18 crc kubenswrapper[4763]: I0930 14:57:18.103212 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:18 crc kubenswrapper[4763]: I0930 14:57:18.615109 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-trrf6"] Sep 30 14:57:19 crc kubenswrapper[4763]: I0930 14:57:19.299374 4763 generic.go:334] "Generic (PLEG): container finished" podID="b9a03e08-83b7-4928-b892-02b35b33ab7b" containerID="314de7dfd4501fad02223dcd1d5ce4f5ae405c70554b3f5106d171a739d3b9d5" exitCode=0 Sep 30 14:57:19 crc kubenswrapper[4763]: I0930 14:57:19.299440 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trrf6" event={"ID":"b9a03e08-83b7-4928-b892-02b35b33ab7b","Type":"ContainerDied","Data":"314de7dfd4501fad02223dcd1d5ce4f5ae405c70554b3f5106d171a739d3b9d5"} Sep 30 14:57:19 crc kubenswrapper[4763]: I0930 14:57:19.299723 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trrf6" event={"ID":"b9a03e08-83b7-4928-b892-02b35b33ab7b","Type":"ContainerStarted","Data":"3ab44c86023376527525e3d098b8503afe07bae0a2456eafeb062abc5b010efe"} Sep 30 14:57:21 crc kubenswrapper[4763]: I0930 14:57:21.314941 4763 generic.go:334] "Generic (PLEG): container finished" podID="b9a03e08-83b7-4928-b892-02b35b33ab7b" containerID="907dfbffc8884a56cfaf0d22e947961b282fc50cd02176aa20c25f1d37dc5197" exitCode=0 Sep 30 14:57:21 crc kubenswrapper[4763]: I0930 14:57:21.314978 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trrf6" event={"ID":"b9a03e08-83b7-4928-b892-02b35b33ab7b","Type":"ContainerDied","Data":"907dfbffc8884a56cfaf0d22e947961b282fc50cd02176aa20c25f1d37dc5197"} Sep 30 14:57:22 crc kubenswrapper[4763]: I0930 14:57:22.324386 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trrf6" event={"ID":"b9a03e08-83b7-4928-b892-02b35b33ab7b","Type":"ContainerStarted","Data":"0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60"} Sep 30 14:57:22 crc kubenswrapper[4763]: I0930 14:57:22.342753 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-trrf6" podStartSLOduration=2.623613591 podStartE2EDuration="5.34273288s" podCreationTimestamp="2025-09-30 14:57:17 +0000 UTC" firstStartedPulling="2025-09-30 14:57:19.300890383 +0000 UTC m=+4911.439450668" lastFinishedPulling="2025-09-30 14:57:22.020009642 +0000 UTC m=+4914.158569957" observedRunningTime="2025-09-30 14:57:22.341738075 +0000 UTC m=+4914.480298360" watchObservedRunningTime="2025-09-30 14:57:22.34273288 +0000 UTC m=+4914.481293165" Sep 30 14:57:28 crc kubenswrapper[4763]: I0930 14:57:28.103571 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:28 crc kubenswrapper[4763]: I0930 14:57:28.103935 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:28 crc kubenswrapper[4763]: I0930 14:57:28.154451 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:28 crc kubenswrapper[4763]: I0930 14:57:28.404216 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:28 crc kubenswrapper[4763]: I0930 14:57:28.448866 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-trrf6"] Sep 30 14:57:30 crc kubenswrapper[4763]: I0930 14:57:30.377765 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-trrf6" podUID="b9a03e08-83b7-4928-b892-02b35b33ab7b" containerName="registry-server" containerID="cri-o://0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60" gracePeriod=2 Sep 30 14:57:30 crc kubenswrapper[4763]: I0930 14:57:30.828708 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:30 crc kubenswrapper[4763]: I0930 14:57:30.918827 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chzkn\" (UniqueName: \"kubernetes.io/projected/b9a03e08-83b7-4928-b892-02b35b33ab7b-kube-api-access-chzkn\") pod \"b9a03e08-83b7-4928-b892-02b35b33ab7b\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " Sep 30 14:57:30 crc kubenswrapper[4763]: I0930 14:57:30.919223 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-utilities\") pod \"b9a03e08-83b7-4928-b892-02b35b33ab7b\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " Sep 30 14:57:30 crc kubenswrapper[4763]: I0930 14:57:30.919865 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-catalog-content\") pod \"b9a03e08-83b7-4928-b892-02b35b33ab7b\" (UID: \"b9a03e08-83b7-4928-b892-02b35b33ab7b\") " Sep 30 14:57:30 crc kubenswrapper[4763]: I0930 14:57:30.920198 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-utilities" (OuterVolumeSpecName: "utilities") pod "b9a03e08-83b7-4928-b892-02b35b33ab7b" (UID: "b9a03e08-83b7-4928-b892-02b35b33ab7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:57:30 crc kubenswrapper[4763]: I0930 14:57:30.920478 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:57:30 crc kubenswrapper[4763]: I0930 14:57:30.924305 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a03e08-83b7-4928-b892-02b35b33ab7b-kube-api-access-chzkn" (OuterVolumeSpecName: "kube-api-access-chzkn") pod "b9a03e08-83b7-4928-b892-02b35b33ab7b" (UID: "b9a03e08-83b7-4928-b892-02b35b33ab7b"). InnerVolumeSpecName "kube-api-access-chzkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:57:30 crc kubenswrapper[4763]: I0930 14:57:30.969234 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9a03e08-83b7-4928-b892-02b35b33ab7b" (UID: "b9a03e08-83b7-4928-b892-02b35b33ab7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.021855 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chzkn\" (UniqueName: \"kubernetes.io/projected/b9a03e08-83b7-4928-b892-02b35b33ab7b-kube-api-access-chzkn\") on node \"crc\" DevicePath \"\"" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.021900 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a03e08-83b7-4928-b892-02b35b33ab7b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.392641 4763 generic.go:334] "Generic (PLEG): container finished" podID="b9a03e08-83b7-4928-b892-02b35b33ab7b" containerID="0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60" exitCode=0 Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.392691 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trrf6" event={"ID":"b9a03e08-83b7-4928-b892-02b35b33ab7b","Type":"ContainerDied","Data":"0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60"} Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.392719 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-trrf6" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.392737 4763 scope.go:117] "RemoveContainer" containerID="0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.392722 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trrf6" event={"ID":"b9a03e08-83b7-4928-b892-02b35b33ab7b","Type":"ContainerDied","Data":"3ab44c86023376527525e3d098b8503afe07bae0a2456eafeb062abc5b010efe"} Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.413798 4763 scope.go:117] "RemoveContainer" containerID="907dfbffc8884a56cfaf0d22e947961b282fc50cd02176aa20c25f1d37dc5197" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.429356 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-trrf6"] Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.435928 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-trrf6"] Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.438119 4763 scope.go:117] "RemoveContainer" containerID="314de7dfd4501fad02223dcd1d5ce4f5ae405c70554b3f5106d171a739d3b9d5" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.472871 4763 scope.go:117] "RemoveContainer" containerID="0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60" Sep 30 14:57:31 crc kubenswrapper[4763]: E0930 14:57:31.473460 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60\": container with ID starting with 
0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60 not found: ID does not exist" containerID="0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.473515 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60"} err="failed to get container status \"0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60\": rpc error: code = NotFound desc = could not find container \"0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60\": container with ID starting with 0c3cb5a450e3fa03c6d6ad4b9ec1ec77e26c5653100e46ca9526c9ce1b6c7c60 not found: ID does not exist" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.473541 4763 scope.go:117] "RemoveContainer" containerID="907dfbffc8884a56cfaf0d22e947961b282fc50cd02176aa20c25f1d37dc5197" Sep 30 14:57:31 crc kubenswrapper[4763]: E0930 14:57:31.474084 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"907dfbffc8884a56cfaf0d22e947961b282fc50cd02176aa20c25f1d37dc5197\": container with ID starting with 907dfbffc8884a56cfaf0d22e947961b282fc50cd02176aa20c25f1d37dc5197 not found: ID does not exist" containerID="907dfbffc8884a56cfaf0d22e947961b282fc50cd02176aa20c25f1d37dc5197" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.474174 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"907dfbffc8884a56cfaf0d22e947961b282fc50cd02176aa20c25f1d37dc5197"} err="failed to get container status \"907dfbffc8884a56cfaf0d22e947961b282fc50cd02176aa20c25f1d37dc5197\": rpc error: code = NotFound desc = could not find container \"907dfbffc8884a56cfaf0d22e947961b282fc50cd02176aa20c25f1d37dc5197\": container with ID starting with 907dfbffc8884a56cfaf0d22e947961b282fc50cd02176aa20c25f1d37dc5197 not found: ID does not exist" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.474248 4763 scope.go:117] "RemoveContainer" containerID="314de7dfd4501fad02223dcd1d5ce4f5ae405c70554b3f5106d171a739d3b9d5" Sep 30 14:57:31 crc kubenswrapper[4763]: E0930 14:57:31.474732 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"314de7dfd4501fad02223dcd1d5ce4f5ae405c70554b3f5106d171a739d3b9d5\": container with ID starting with 314de7dfd4501fad02223dcd1d5ce4f5ae405c70554b3f5106d171a739d3b9d5 not found: ID does not exist" containerID="314de7dfd4501fad02223dcd1d5ce4f5ae405c70554b3f5106d171a739d3b9d5" Sep 30 14:57:31 crc kubenswrapper[4763]: I0930 14:57:31.474832 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314de7dfd4501fad02223dcd1d5ce4f5ae405c70554b3f5106d171a739d3b9d5"} err="failed to get container status \"314de7dfd4501fad02223dcd1d5ce4f5ae405c70554b3f5106d171a739d3b9d5\": rpc error: code = NotFound desc = could not find container \"314de7dfd4501fad02223dcd1d5ce4f5ae405c70554b3f5106d171a739d3b9d5\": container with ID starting with 314de7dfd4501fad02223dcd1d5ce4f5ae405c70554b3f5106d171a739d3b9d5 not found: ID does not exist" Sep 30 14:57:32 crc kubenswrapper[4763]: I0930 14:57:32.500863 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9a03e08-83b7-4928-b892-02b35b33ab7b" path="/var/lib/kubelet/pods/b9a03e08-83b7-4928-b892-02b35b33ab7b/volumes" Sep 30 14:57:36 crc kubenswrapper[4763]: I0930 14:57:36.060657 
4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:57:36 crc kubenswrapper[4763]: I0930 14:57:36.061243 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:57:36 crc kubenswrapper[4763]: I0930 14:57:36.061324 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 14:57:36 crc kubenswrapper[4763]: I0930 14:57:36.062463 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:57:36 crc kubenswrapper[4763]: I0930 14:57:36.063002 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" gracePeriod=600 Sep 30 14:57:36 crc kubenswrapper[4763]: E0930 14:57:36.183133 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:57:36 crc kubenswrapper[4763]: I0930 14:57:36.427801 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" exitCode=0 Sep 30 14:57:36 crc kubenswrapper[4763]: I0930 14:57:36.427847 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3"} Sep 30 14:57:36 crc kubenswrapper[4763]: I0930 14:57:36.427915 4763 scope.go:117] "RemoveContainer" containerID="4da597f3a8b25df538033d6e0e1f219426e990b064711c97664abf3092e45110" Sep 30 14:57:36 crc kubenswrapper[4763]: I0930 14:57:36.429048 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 14:57:36 crc kubenswrapper[4763]: E0930 14:57:36.429425 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:57:47 crc kubenswrapper[4763]: I0930 14:57:47.489802 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 14:57:47 crc kubenswrapper[4763]: E0930 14:57:47.490870 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:58:01 crc kubenswrapper[4763]: I0930 14:58:01.489756 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 14:58:01 crc kubenswrapper[4763]: E0930 14:58:01.490511 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:58:12 crc kubenswrapper[4763]: I0930 14:58:12.490864 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 14:58:12 crc kubenswrapper[4763]: E0930 14:58:12.491589 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:58:25 crc kubenswrapper[4763]: I0930 14:58:25.957079 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d987l"] Sep 30 14:58:25 crc kubenswrapper[4763]: E0930 14:58:25.958299 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a03e08-83b7-4928-b892-02b35b33ab7b" containerName="extract-content" Sep 30 14:58:25 crc kubenswrapper[4763]: I0930 14:58:25.958318 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a03e08-83b7-4928-b892-02b35b33ab7b" containerName="extract-content" Sep 30 14:58:25 crc kubenswrapper[4763]: E0930 14:58:25.958333 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a03e08-83b7-4928-b892-02b35b33ab7b" containerName="registry-server" Sep 30 14:58:25 crc kubenswrapper[4763]: I0930 14:58:25.958342 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a03e08-83b7-4928-b892-02b35b33ab7b" containerName="registry-server" Sep 30 14:58:25 crc kubenswrapper[4763]: E0930 14:58:25.958395 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a03e08-83b7-4928-b892-02b35b33ab7b" containerName="extract-utilities" Sep 30 14:58:25 crc kubenswrapper[4763]: I0930 14:58:25.958406 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a03e08-83b7-4928-b892-02b35b33ab7b" containerName="extract-utilities" Sep 30 14:58:25 crc kubenswrapper[4763]: I0930 
14:58:25.958751 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a03e08-83b7-4928-b892-02b35b33ab7b" containerName="registry-server" Sep 30 14:58:25 crc kubenswrapper[4763]: I0930 14:58:25.960824 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:25 crc kubenswrapper[4763]: I0930 14:58:25.964968 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d987l"] Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.051510 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brjhw\" (UniqueName: \"kubernetes.io/projected/848ce3c4-a1ee-48f6-a387-341571572384-kube-api-access-brjhw\") pod \"community-operators-d987l\" (UID: \"848ce3c4-a1ee-48f6-a387-341571572384\") " pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.051592 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848ce3c4-a1ee-48f6-a387-341571572384-catalog-content\") pod \"community-operators-d987l\" (UID: \"848ce3c4-a1ee-48f6-a387-341571572384\") " pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.051712 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848ce3c4-a1ee-48f6-a387-341571572384-utilities\") pod \"community-operators-d987l\" (UID: \"848ce3c4-a1ee-48f6-a387-341571572384\") " pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.153470 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brjhw\" (UniqueName: \"kubernetes.io/projected/848ce3c4-a1ee-48f6-a387-341571572384-kube-api-access-brjhw\") pod \"community-operators-d987l\" (UID: \"848ce3c4-a1ee-48f6-a387-341571572384\") " pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.153539 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848ce3c4-a1ee-48f6-a387-341571572384-catalog-content\") pod \"community-operators-d987l\" (UID: \"848ce3c4-a1ee-48f6-a387-341571572384\") " pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.153584 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848ce3c4-a1ee-48f6-a387-341571572384-utilities\") pod \"community-operators-d987l\" (UID: \"848ce3c4-a1ee-48f6-a387-341571572384\") " pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.154113 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848ce3c4-a1ee-48f6-a387-341571572384-catalog-content\") pod \"community-operators-d987l\" (UID: \"848ce3c4-a1ee-48f6-a387-341571572384\") " pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.154162 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/848ce3c4-a1ee-48f6-a387-341571572384-utilities\") pod \"community-operators-d987l\" (UID: \"848ce3c4-a1ee-48f6-a387-341571572384\") " pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.178543 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brjhw\" (UniqueName: \"kubernetes.io/projected/848ce3c4-a1ee-48f6-a387-341571572384-kube-api-access-brjhw\") pod \"community-operators-d987l\" (UID: \"848ce3c4-a1ee-48f6-a387-341571572384\") " pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.285656 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.489837 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 14:58:26 crc kubenswrapper[4763]: E0930 14:58:26.490287 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.765654 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d987l"] Sep 30 14:58:26 crc kubenswrapper[4763]: I0930 14:58:26.820459 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d987l" event={"ID":"848ce3c4-a1ee-48f6-a387-341571572384","Type":"ContainerStarted","Data":"b634a93adb8b3fd81e1dcb7c59677734ece76b075030e65be499b0f4f14a81d4"} Sep 30 14:58:27 crc kubenswrapper[4763]: I0930 14:58:27.830257 4763 generic.go:334] "Generic (PLEG): container finished" podID="848ce3c4-a1ee-48f6-a387-341571572384" containerID="669cd8c2cac36d423ccc3ed8a70a99c66f2cd3b66c7e13eb082a4a152b5b3dd4" exitCode=0 Sep 30 14:58:27 crc kubenswrapper[4763]: I0930 14:58:27.830362 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d987l" event={"ID":"848ce3c4-a1ee-48f6-a387-341571572384","Type":"ContainerDied","Data":"669cd8c2cac36d423ccc3ed8a70a99c66f2cd3b66c7e13eb082a4a152b5b3dd4"} Sep 30 14:58:27 crc kubenswrapper[4763]: I0930 14:58:27.832467 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:58:31 crc kubenswrapper[4763]: I0930 14:58:31.861427 4763 generic.go:334] "Generic (PLEG): container finished" podID="848ce3c4-a1ee-48f6-a387-341571572384" containerID="763c7198a47666898a4466240aed05ed2e0adf1f40ed2feaa6662b6929b49d2e" exitCode=0 Sep 30 14:58:31 crc kubenswrapper[4763]: I0930 14:58:31.861542 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d987l" event={"ID":"848ce3c4-a1ee-48f6-a387-341571572384","Type":"ContainerDied","Data":"763c7198a47666898a4466240aed05ed2e0adf1f40ed2feaa6662b6929b49d2e"} Sep 30 14:58:32 crc kubenswrapper[4763]: I0930 14:58:32.869903 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d987l" 
event={"ID":"848ce3c4-a1ee-48f6-a387-341571572384","Type":"ContainerStarted","Data":"fcedd1d7eb068822da6482fb7bee1bec24003f5c6797c7d1b4072d8714f46fc3"} Sep 30 14:58:32 crc kubenswrapper[4763]: I0930 14:58:32.891973 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d987l" podStartSLOduration=3.4745335 podStartE2EDuration="7.8919463s" podCreationTimestamp="2025-09-30 14:58:25 +0000 UTC" firstStartedPulling="2025-09-30 14:58:27.832179311 +0000 UTC m=+4979.970739606" lastFinishedPulling="2025-09-30 14:58:32.249592111 +0000 UTC m=+4984.388152406" observedRunningTime="2025-09-30 14:58:32.888509034 +0000 UTC m=+4985.027069379" watchObservedRunningTime="2025-09-30 14:58:32.8919463 +0000 UTC m=+4985.030506625" Sep 30 14:58:36 crc kubenswrapper[4763]: I0930 14:58:36.285983 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:36 crc kubenswrapper[4763]: I0930 14:58:36.286275 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:36 crc kubenswrapper[4763]: I0930 14:58:36.327998 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:37 crc kubenswrapper[4763]: I0930 14:58:37.489727 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 14:58:37 crc kubenswrapper[4763]: E0930 14:58:37.490045 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.337901 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d987l" Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.392839 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d987l"] Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.435346 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b6brb"] Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.435764 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b6brb" podUID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" containerName="registry-server" containerID="cri-o://0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4" gracePeriod=2 Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.844798 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b6brb" Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.863180 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-utilities\") pod \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.863351 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqlkl\" (UniqueName: \"kubernetes.io/projected/adf74762-2792-4fe2-8ce5-e5e7c7f88469-kube-api-access-zqlkl\") pod \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.863400 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-catalog-content\") pod \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\" (UID: \"adf74762-2792-4fe2-8ce5-e5e7c7f88469\") " Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.864026 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-utilities" (OuterVolumeSpecName: "utilities") pod "adf74762-2792-4fe2-8ce5-e5e7c7f88469" (UID: "adf74762-2792-4fe2-8ce5-e5e7c7f88469"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.886557 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf74762-2792-4fe2-8ce5-e5e7c7f88469-kube-api-access-zqlkl" (OuterVolumeSpecName: "kube-api-access-zqlkl") pod "adf74762-2792-4fe2-8ce5-e5e7c7f88469" (UID: "adf74762-2792-4fe2-8ce5-e5e7c7f88469"). InnerVolumeSpecName "kube-api-access-zqlkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.919543 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adf74762-2792-4fe2-8ce5-e5e7c7f88469" (UID: "adf74762-2792-4fe2-8ce5-e5e7c7f88469"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.965572 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqlkl\" (UniqueName: \"kubernetes.io/projected/adf74762-2792-4fe2-8ce5-e5e7c7f88469-kube-api-access-zqlkl\") on node \"crc\" DevicePath \"\"" Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.965916 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.965930 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adf74762-2792-4fe2-8ce5-e5e7c7f88469-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.988398 4763 generic.go:334] "Generic (PLEG): container finished" podID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" containerID="0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4" exitCode=0 Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.988458 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b6brb" Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.988484 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6brb" event={"ID":"adf74762-2792-4fe2-8ce5-e5e7c7f88469","Type":"ContainerDied","Data":"0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4"} Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.988536 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6brb" event={"ID":"adf74762-2792-4fe2-8ce5-e5e7c7f88469","Type":"ContainerDied","Data":"896814538cf8adff64302e6c7bc7d87dcbc0ef26731d607b9a39ef7ec3e13cbc"} Sep 30 14:58:46 crc kubenswrapper[4763]: I0930 14:58:46.988558 4763 scope.go:117] "RemoveContainer" containerID="0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4" Sep 30 14:58:47 crc kubenswrapper[4763]: I0930 14:58:47.008278 4763 scope.go:117] "RemoveContainer" containerID="cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1" Sep 30 14:58:47 crc kubenswrapper[4763]: I0930 14:58:47.015241 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b6brb"] Sep 30 14:58:47 crc kubenswrapper[4763]: I0930 14:58:47.023073 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b6brb"] Sep 30 14:58:47 crc kubenswrapper[4763]: I0930 14:58:47.046338 4763 scope.go:117] "RemoveContainer" containerID="fbcf4bc423f396ecc77b30cc70b1a9e25af31c1e3b6bf764c38fd4ee0818657d" Sep 30 14:58:47 crc kubenswrapper[4763]: I0930 14:58:47.064560 4763 scope.go:117] "RemoveContainer" containerID="0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4" Sep 30 14:58:47 crc kubenswrapper[4763]: E0930 14:58:47.065010 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4\": container with ID starting with 0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4 not found: ID does not exist" containerID="0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4" Sep 30 14:58:47 crc kubenswrapper[4763]: I0930 14:58:47.065054 
4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4"} err="failed to get container status \"0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4\": rpc error: code = NotFound desc = could not find container \"0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4\": container with ID starting with 0ba9951ff876b6ba6a6553c504da806b1e799f5d916e1d125f807544b1b2aaf4 not found: ID does not exist" Sep 30 14:58:47 crc kubenswrapper[4763]: I0930 14:58:47.065085 4763 scope.go:117] "RemoveContainer" containerID="cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1" Sep 30 14:58:47 crc kubenswrapper[4763]: E0930 14:58:47.065441 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1\": container with ID starting with cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1 not found: ID does not exist" containerID="cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1" Sep 30 14:58:47 crc kubenswrapper[4763]: I0930 14:58:47.065498 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1"} err="failed to get container status \"cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1\": rpc error: code = NotFound desc = could not find container \"cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1\": container with ID starting with cb135977a6e5293d136b9f36a81c7be98404f0808f755c5367c83efb5e1dfdc1 not found: ID does not exist" Sep 30 14:58:47 crc kubenswrapper[4763]: I0930 14:58:47.065530 4763 scope.go:117] "RemoveContainer" containerID="fbcf4bc423f396ecc77b30cc70b1a9e25af31c1e3b6bf764c38fd4ee0818657d" Sep 30 14:58:47 crc kubenswrapper[4763]: E0930 14:58:47.066043 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbcf4bc423f396ecc77b30cc70b1a9e25af31c1e3b6bf764c38fd4ee0818657d\": container with ID starting with fbcf4bc423f396ecc77b30cc70b1a9e25af31c1e3b6bf764c38fd4ee0818657d not found: ID does not exist" containerID="fbcf4bc423f396ecc77b30cc70b1a9e25af31c1e3b6bf764c38fd4ee0818657d" Sep 30 14:58:47 crc kubenswrapper[4763]: I0930 14:58:47.066079 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbcf4bc423f396ecc77b30cc70b1a9e25af31c1e3b6bf764c38fd4ee0818657d"} err="failed to get container status \"fbcf4bc423f396ecc77b30cc70b1a9e25af31c1e3b6bf764c38fd4ee0818657d\": rpc error: code = NotFound desc = could not find container \"fbcf4bc423f396ecc77b30cc70b1a9e25af31c1e3b6bf764c38fd4ee0818657d\": container with ID starting with fbcf4bc423f396ecc77b30cc70b1a9e25af31c1e3b6bf764c38fd4ee0818657d not found: ID does not exist" Sep 30 14:58:48 crc kubenswrapper[4763]: I0930 14:58:48.497257 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 14:58:48 crc kubenswrapper[4763]: E0930 14:58:48.497472 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:58:48 crc kubenswrapper[4763]: I0930 14:58:48.501753 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" path="/var/lib/kubelet/pods/adf74762-2792-4fe2-8ce5-e5e7c7f88469/volumes" Sep 30 14:59:00 crc kubenswrapper[4763]: I0930 14:59:00.488882 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 14:59:00 crc kubenswrapper[4763]: E0930 14:59:00.490901 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:59:14 crc kubenswrapper[4763]: I0930 14:59:14.489263 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 14:59:14 crc kubenswrapper[4763]: E0930 14:59:14.490052 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:59:29 crc kubenswrapper[4763]: I0930 14:59:29.489686 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 14:59:29 crc kubenswrapper[4763]: E0930 14:59:29.490442 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:59:43 crc kubenswrapper[4763]: I0930 14:59:43.489739 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 14:59:43 crc kubenswrapper[4763]: E0930 14:59:43.491366 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.246179 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Sep 30 14:59:55 crc kubenswrapper[4763]: E0930 14:59:55.246966 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" containerName="registry-server" Sep 30 14:59:55 crc kubenswrapper[4763]: 
I0930 14:59:55.246978 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" containerName="registry-server" Sep 30 14:59:55 crc kubenswrapper[4763]: E0930 14:59:55.247001 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" containerName="extract-utilities" Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.247007 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" containerName="extract-utilities" Sep 30 14:59:55 crc kubenswrapper[4763]: E0930 14:59:55.247029 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" containerName="extract-content" Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.247036 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" containerName="extract-content" Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.247179 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf74762-2792-4fe2-8ce5-e5e7c7f88469" containerName="registry-server" Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.247728 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.253098 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-rzn4l" Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.255634 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.319735 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c7e0fd8e-98b0-4e35-88b2-47d0e7d96b43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7e0fd8e-98b0-4e35-88b2-47d0e7d96b43\") pod \"mariadb-copy-data\" (UID: \"67af32f0-7954-4054-a5c0-cdb6da84d408\") " pod="openstack/mariadb-copy-data" Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.320086 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zbrr\" (UniqueName: \"kubernetes.io/projected/67af32f0-7954-4054-a5c0-cdb6da84d408-kube-api-access-2zbrr\") pod \"mariadb-copy-data\" (UID: \"67af32f0-7954-4054-a5c0-cdb6da84d408\") " pod="openstack/mariadb-copy-data" Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.422269 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zbrr\" (UniqueName: \"kubernetes.io/projected/67af32f0-7954-4054-a5c0-cdb6da84d408-kube-api-access-2zbrr\") pod \"mariadb-copy-data\" (UID: \"67af32f0-7954-4054-a5c0-cdb6da84d408\") " pod="openstack/mariadb-copy-data" Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.422735 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c7e0fd8e-98b0-4e35-88b2-47d0e7d96b43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7e0fd8e-98b0-4e35-88b2-47d0e7d96b43\") pod \"mariadb-copy-data\" (UID: \"67af32f0-7954-4054-a5c0-cdb6da84d408\") " pod="openstack/mariadb-copy-data" Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.425958 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.425996 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c7e0fd8e-98b0-4e35-88b2-47d0e7d96b43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7e0fd8e-98b0-4e35-88b2-47d0e7d96b43\") pod \"mariadb-copy-data\" (UID: \"67af32f0-7954-4054-a5c0-cdb6da84d408\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9fa8aeab36302e450399d74fc071df500c123b539b6b285dcfa5be6bee25f768/globalmount\"" pod="openstack/mariadb-copy-data"
Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.443688 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zbrr\" (UniqueName: \"kubernetes.io/projected/67af32f0-7954-4054-a5c0-cdb6da84d408-kube-api-access-2zbrr\") pod \"mariadb-copy-data\" (UID: \"67af32f0-7954-4054-a5c0-cdb6da84d408\") " pod="openstack/mariadb-copy-data"
Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.458331 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c7e0fd8e-98b0-4e35-88b2-47d0e7d96b43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7e0fd8e-98b0-4e35-88b2-47d0e7d96b43\") pod \"mariadb-copy-data\" (UID: \"67af32f0-7954-4054-a5c0-cdb6da84d408\") " pod="openstack/mariadb-copy-data"
Sep 30 14:59:55 crc kubenswrapper[4763]: I0930 14:59:55.576081 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Sep 30 14:59:56 crc kubenswrapper[4763]: I0930 14:59:56.067375 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Sep 30 14:59:56 crc kubenswrapper[4763]: I0930 14:59:56.500441 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"67af32f0-7954-4054-a5c0-cdb6da84d408","Type":"ContainerStarted","Data":"a972948f2a3cc8ea6703e5d21d4fc0eda521c650adf5c30532e7cec0956a0a7f"}
Sep 30 14:59:56 crc kubenswrapper[4763]: I0930 14:59:56.500481 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"67af32f0-7954-4054-a5c0-cdb6da84d408","Type":"ContainerStarted","Data":"2740feec80b9c511b74d343f61323ac40a6a7bf25f0e3f022ec0970f1590928a"}
Sep 30 14:59:56 crc kubenswrapper[4763]: I0930 14:59:56.513845 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.513826504 podStartE2EDuration="2.513826504s" podCreationTimestamp="2025-09-30 14:59:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:59:56.507034835 +0000 UTC m=+5068.645595150" watchObservedRunningTime="2025-09-30 14:59:56.513826504 +0000 UTC m=+5068.652386789"
Sep 30 14:59:58 crc kubenswrapper[4763]: I0930 14:59:58.287459 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Sep 30 14:59:58 crc kubenswrapper[4763]: I0930 14:59:58.291287 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Sep 30 14:59:58 crc kubenswrapper[4763]: I0930 14:59:58.302397 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Sep 30 14:59:58 crc kubenswrapper[4763]: I0930 14:59:58.469327 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwl4q\" (UniqueName: \"kubernetes.io/projected/3ac64add-a06d-4cc5-8dbf-84f22be37b72-kube-api-access-zwl4q\") pod \"mariadb-client\" (UID: \"3ac64add-a06d-4cc5-8dbf-84f22be37b72\") " pod="openstack/mariadb-client"
Sep 30 14:59:58 crc kubenswrapper[4763]: I0930 14:59:58.494548 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3"
Sep 30 14:59:58 crc kubenswrapper[4763]: E0930 14:59:58.494853 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 14:59:58 crc kubenswrapper[4763]: I0930 14:59:58.571278 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwl4q\" (UniqueName: \"kubernetes.io/projected/3ac64add-a06d-4cc5-8dbf-84f22be37b72-kube-api-access-zwl4q\") pod \"mariadb-client\" (UID: \"3ac64add-a06d-4cc5-8dbf-84f22be37b72\") " pod="openstack/mariadb-client"
Sep 30 14:59:58 crc kubenswrapper[4763]: I0930 14:59:58.596081 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwl4q\" (UniqueName: \"kubernetes.io/projected/3ac64add-a06d-4cc5-8dbf-84f22be37b72-kube-api-access-zwl4q\") pod \"mariadb-client\" (UID: \"3ac64add-a06d-4cc5-8dbf-84f22be37b72\") " pod="openstack/mariadb-client"
Sep 30 14:59:58 crc kubenswrapper[4763]: I0930 14:59:58.620250 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Sep 30 14:59:59 crc kubenswrapper[4763]: I0930 14:59:59.095026 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Sep 30 14:59:59 crc kubenswrapper[4763]: I0930 14:59:59.518802 4763 generic.go:334] "Generic (PLEG): container finished" podID="3ac64add-a06d-4cc5-8dbf-84f22be37b72" containerID="08e4ef0ff1c31b2201a8788992884ff64f6030d207f2f9fbceb7eb99ca10448f" exitCode=0
Sep 30 14:59:59 crc kubenswrapper[4763]: I0930 14:59:59.519066 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"3ac64add-a06d-4cc5-8dbf-84f22be37b72","Type":"ContainerDied","Data":"08e4ef0ff1c31b2201a8788992884ff64f6030d207f2f9fbceb7eb99ca10448f"}
Sep 30 14:59:59 crc kubenswrapper[4763]: I0930 14:59:59.520359 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"3ac64add-a06d-4cc5-8dbf-84f22be37b72","Type":"ContainerStarted","Data":"a46c107e14dd1f2741a380ebdad529875519c4b0372e2a2d451e371fc545b567"}
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.142423 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"]
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.143692 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.146225 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.146255 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.152290 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"]
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.301367 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjh4x\" (UniqueName: \"kubernetes.io/projected/8ea5ea06-98fb-4192-b677-805f1c620e81-kube-api-access-sjh4x\") pod \"collect-profiles-29320740-2x86t\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.301819 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ea5ea06-98fb-4192-b677-805f1c620e81-secret-volume\") pod \"collect-profiles-29320740-2x86t\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.301875 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ea5ea06-98fb-4192-b677-805f1c620e81-config-volume\") pod \"collect-profiles-29320740-2x86t\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.402862 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjh4x\" (UniqueName: \"kubernetes.io/projected/8ea5ea06-98fb-4192-b677-805f1c620e81-kube-api-access-sjh4x\") pod \"collect-profiles-29320740-2x86t\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.403105 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ea5ea06-98fb-4192-b677-805f1c620e81-secret-volume\") pod \"collect-profiles-29320740-2x86t\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.403735 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ea5ea06-98fb-4192-b677-805f1c620e81-config-volume\") pod \"collect-profiles-29320740-2x86t\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.404668 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ea5ea06-98fb-4192-b677-805f1c620e81-config-volume\") pod \"collect-profiles-29320740-2x86t\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.409383 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ea5ea06-98fb-4192-b677-805f1c620e81-secret-volume\") pod \"collect-profiles-29320740-2x86t\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.422665 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjh4x\" (UniqueName: \"kubernetes.io/projected/8ea5ea06-98fb-4192-b677-805f1c620e81-kube-api-access-sjh4x\") pod \"collect-profiles-29320740-2x86t\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.467788 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.806804 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.834846 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_3ac64add-a06d-4cc5-8dbf-84f22be37b72/mariadb-client/0.log"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.863236 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.869587 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.911800 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwl4q\" (UniqueName: \"kubernetes.io/projected/3ac64add-a06d-4cc5-8dbf-84f22be37b72-kube-api-access-zwl4q\") pod \"3ac64add-a06d-4cc5-8dbf-84f22be37b72\" (UID: \"3ac64add-a06d-4cc5-8dbf-84f22be37b72\") "
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.916095 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac64add-a06d-4cc5-8dbf-84f22be37b72-kube-api-access-zwl4q" (OuterVolumeSpecName: "kube-api-access-zwl4q") pod "3ac64add-a06d-4cc5-8dbf-84f22be37b72" (UID: "3ac64add-a06d-4cc5-8dbf-84f22be37b72"). InnerVolumeSpecName "kube-api-access-zwl4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.963216 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"]
Sep 30 15:00:00 crc kubenswrapper[4763]: W0930 15:00:00.968977 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ea5ea06_98fb_4192_b677_805f1c620e81.slice/crio-e99d98e976837d110c372f5e969f00bf6abebb02ef2a3e0f2404ce0e32345bd3 WatchSource:0}: Error finding container e99d98e976837d110c372f5e969f00bf6abebb02ef2a3e0f2404ce0e32345bd3: Status 404 returned error can't find the container with id e99d98e976837d110c372f5e969f00bf6abebb02ef2a3e0f2404ce0e32345bd3
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.988096 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Sep 30 15:00:00 crc kubenswrapper[4763]: E0930 15:00:00.988506 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac64add-a06d-4cc5-8dbf-84f22be37b72" containerName="mariadb-client"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.988527 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac64add-a06d-4cc5-8dbf-84f22be37b72" containerName="mariadb-client"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.988748 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac64add-a06d-4cc5-8dbf-84f22be37b72" containerName="mariadb-client"
Sep 30 15:00:00 crc kubenswrapper[4763]: I0930 15:00:00.989351 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.004233 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.020351 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwl4q\" (UniqueName: \"kubernetes.io/projected/3ac64add-a06d-4cc5-8dbf-84f22be37b72-kube-api-access-zwl4q\") on node \"crc\" DevicePath \"\""
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.122425 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvss5\" (UniqueName: \"kubernetes.io/projected/25387383-f43e-4bf3-a40f-ea7da073e853-kube-api-access-zvss5\") pod \"mariadb-client\" (UID: \"25387383-f43e-4bf3-a40f-ea7da073e853\") " pod="openstack/mariadb-client"
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.223765 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvss5\" (UniqueName: \"kubernetes.io/projected/25387383-f43e-4bf3-a40f-ea7da073e853-kube-api-access-zvss5\") pod \"mariadb-client\" (UID: \"25387383-f43e-4bf3-a40f-ea7da073e853\") " pod="openstack/mariadb-client"
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.245365 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvss5\" (UniqueName: \"kubernetes.io/projected/25387383-f43e-4bf3-a40f-ea7da073e853-kube-api-access-zvss5\") pod \"mariadb-client\" (UID: \"25387383-f43e-4bf3-a40f-ea7da073e853\") " pod="openstack/mariadb-client"
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.315499 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.536336 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a46c107e14dd1f2741a380ebdad529875519c4b0372e2a2d451e371fc545b567"
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.536414 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.538005 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ea5ea06-98fb-4192-b677-805f1c620e81" containerID="20f225b0d25ad0fe8dc67946c343a60f730b493cd344264f92d9cae89b5f9be5" exitCode=0
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.538053 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t" event={"ID":"8ea5ea06-98fb-4192-b677-805f1c620e81","Type":"ContainerDied","Data":"20f225b0d25ad0fe8dc67946c343a60f730b493cd344264f92d9cae89b5f9be5"}
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.538083 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t" event={"ID":"8ea5ea06-98fb-4192-b677-805f1c620e81","Type":"ContainerStarted","Data":"e99d98e976837d110c372f5e969f00bf6abebb02ef2a3e0f2404ce0e32345bd3"}
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.562493 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="3ac64add-a06d-4cc5-8dbf-84f22be37b72" podUID="25387383-f43e-4bf3-a40f-ea7da073e853"
Sep 30 15:00:01 crc kubenswrapper[4763]: I0930 15:00:01.785052 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Sep 30 15:00:01 crc kubenswrapper[4763]: W0930 15:00:01.791661 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25387383_f43e_4bf3_a40f_ea7da073e853.slice/crio-f50ea2aed9a0b0261564879bf0d77bf7322353e933dc777d19b2d88c471252fa WatchSource:0}: Error finding container f50ea2aed9a0b0261564879bf0d77bf7322353e933dc777d19b2d88c471252fa: Status 404 returned error can't find the container with id f50ea2aed9a0b0261564879bf0d77bf7322353e933dc777d19b2d88c471252fa
Sep 30 15:00:02 crc kubenswrapper[4763]: I0930 15:00:02.501009 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac64add-a06d-4cc5-8dbf-84f22be37b72" path="/var/lib/kubelet/pods/3ac64add-a06d-4cc5-8dbf-84f22be37b72/volumes"
Sep 30 15:00:02 crc kubenswrapper[4763]: I0930 15:00:02.546254 4763 generic.go:334] "Generic (PLEG): container finished" podID="25387383-f43e-4bf3-a40f-ea7da073e853" containerID="25709f5e4ef350d77d6eb0d633f1677235c3f99f610278ac4d3a8524462278d6" exitCode=0
Sep 30 15:00:02 crc kubenswrapper[4763]: I0930 15:00:02.546831 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"25387383-f43e-4bf3-a40f-ea7da073e853","Type":"ContainerDied","Data":"25709f5e4ef350d77d6eb0d633f1677235c3f99f610278ac4d3a8524462278d6"}
Sep 30 15:00:02 crc kubenswrapper[4763]: I0930 15:00:02.546858 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"25387383-f43e-4bf3-a40f-ea7da073e853","Type":"ContainerStarted","Data":"f50ea2aed9a0b0261564879bf0d77bf7322353e933dc777d19b2d88c471252fa"}
Sep 30 15:00:02 crc kubenswrapper[4763]: I0930 15:00:02.878572 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.059518 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ea5ea06-98fb-4192-b677-805f1c620e81-config-volume\") pod \"8ea5ea06-98fb-4192-b677-805f1c620e81\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") "
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.059660 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjh4x\" (UniqueName: \"kubernetes.io/projected/8ea5ea06-98fb-4192-b677-805f1c620e81-kube-api-access-sjh4x\") pod \"8ea5ea06-98fb-4192-b677-805f1c620e81\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") "
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.059691 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ea5ea06-98fb-4192-b677-805f1c620e81-secret-volume\") pod \"8ea5ea06-98fb-4192-b677-805f1c620e81\" (UID: \"8ea5ea06-98fb-4192-b677-805f1c620e81\") "
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.060785 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ea5ea06-98fb-4192-b677-805f1c620e81-config-volume" (OuterVolumeSpecName: "config-volume") pod "8ea5ea06-98fb-4192-b677-805f1c620e81" (UID: "8ea5ea06-98fb-4192-b677-805f1c620e81"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.069897 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea5ea06-98fb-4192-b677-805f1c620e81-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8ea5ea06-98fb-4192-b677-805f1c620e81" (UID: "8ea5ea06-98fb-4192-b677-805f1c620e81"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.071057 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ea5ea06-98fb-4192-b677-805f1c620e81-kube-api-access-sjh4x" (OuterVolumeSpecName: "kube-api-access-sjh4x") pod "8ea5ea06-98fb-4192-b677-805f1c620e81" (UID: "8ea5ea06-98fb-4192-b677-805f1c620e81"). InnerVolumeSpecName "kube-api-access-sjh4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.160896 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ea5ea06-98fb-4192-b677-805f1c620e81-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.160937 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjh4x\" (UniqueName: \"kubernetes.io/projected/8ea5ea06-98fb-4192-b677-805f1c620e81-kube-api-access-sjh4x\") on node \"crc\" DevicePath \"\""
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.160948 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ea5ea06-98fb-4192-b677-805f1c620e81-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.555789 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t"
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.555782 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-2x86t" event={"ID":"8ea5ea06-98fb-4192-b677-805f1c620e81","Type":"ContainerDied","Data":"e99d98e976837d110c372f5e969f00bf6abebb02ef2a3e0f2404ce0e32345bd3"}
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.555845 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e99d98e976837d110c372f5e969f00bf6abebb02ef2a3e0f2404ce0e32345bd3"
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.852958 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.872860 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_25387383-f43e-4bf3-a40f-ea7da073e853/mariadb-client/0.log"
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.872973 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvss5\" (UniqueName: \"kubernetes.io/projected/25387383-f43e-4bf3-a40f-ea7da073e853-kube-api-access-zvss5\") pod \"25387383-f43e-4bf3-a40f-ea7da073e853\" (UID: \"25387383-f43e-4bf3-a40f-ea7da073e853\") "
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.902356 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.910535 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.911786 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25387383-f43e-4bf3-a40f-ea7da073e853-kube-api-access-zvss5" (OuterVolumeSpecName: "kube-api-access-zvss5") pod "25387383-f43e-4bf3-a40f-ea7da073e853" (UID: "25387383-f43e-4bf3-a40f-ea7da073e853"). InnerVolumeSpecName "kube-api-access-zvss5".
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.964795 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh"] Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.972324 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-9h2wh"] Sep 30 15:00:03 crc kubenswrapper[4763]: I0930 15:00:03.975317 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvss5\" (UniqueName: \"kubernetes.io/projected/25387383-f43e-4bf3-a40f-ea7da073e853-kube-api-access-zvss5\") on node \"crc\" DevicePath \"\"" Sep 30 15:00:04 crc kubenswrapper[4763]: I0930 15:00:04.503944 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25387383-f43e-4bf3-a40f-ea7da073e853" path="/var/lib/kubelet/pods/25387383-f43e-4bf3-a40f-ea7da073e853/volumes" Sep 30 15:00:04 crc kubenswrapper[4763]: I0930 15:00:04.505513 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57d1349e-ebb0-4e6c-96db-7b27f9a56494" path="/var/lib/kubelet/pods/57d1349e-ebb0-4e6c-96db-7b27f9a56494/volumes" Sep 30 15:00:04 crc kubenswrapper[4763]: I0930 15:00:04.567480 4763 scope.go:117] "RemoveContainer" containerID="25709f5e4ef350d77d6eb0d633f1677235c3f99f610278ac4d3a8524462278d6" Sep 30 15:00:04 crc kubenswrapper[4763]: I0930 15:00:04.567622 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Sep 30 15:00:12 crc kubenswrapper[4763]: I0930 15:00:12.489875 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:00:12 crc kubenswrapper[4763]: E0930 15:00:12.490639 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:00:25 crc kubenswrapper[4763]: I0930 15:00:25.489213 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:00:25 crc kubenswrapper[4763]: E0930 15:00:25.489826 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:00:34 crc kubenswrapper[4763]: I0930 15:00:34.246723 4763 scope.go:117] "RemoveContainer" containerID="0bee90b678e8e4caae0c8b6e7d19ba366b4649145f9838d7b6b5b4170c014b0d" Sep 30 15:00:38 crc kubenswrapper[4763]: I0930 15:00:38.494477 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:00:38 crc kubenswrapper[4763]: E0930 15:00:38.494937 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:00:52 crc kubenswrapper[4763]: I0930 15:00:52.504319 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:00:52 crc kubenswrapper[4763]: E0930 15:00:52.505162 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:01:07 crc kubenswrapper[4763]: I0930 15:01:07.489878 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:01:07 crc kubenswrapper[4763]: E0930 15:01:07.490863 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:01:19 crc kubenswrapper[4763]: I0930 15:01:19.489412 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:01:19 crc kubenswrapper[4763]: E0930 15:01:19.490499 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:01:31 crc kubenswrapper[4763]: I0930 15:01:31.490230 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:01:31 crc kubenswrapper[4763]: E0930 15:01:31.490981 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:01:34 crc kubenswrapper[4763]: I0930 15:01:34.310555 4763 scope.go:117] "RemoveContainer" containerID="e2b6dc7aa26f01802750fd17600bee09fcc81abea89768200dd79a5dea3a3b62" Sep 30 15:01:34 crc kubenswrapper[4763]: I0930 15:01:34.332306 4763 scope.go:117] "RemoveContainer" containerID="f60c3d8bd26c0408138a15217c0cfc50b613532deff50869d04851947ddaa60f" Sep 30 15:01:34 crc kubenswrapper[4763]: I0930 15:01:34.372676 4763 scope.go:117] "RemoveContainer" containerID="e065e5acbaaa623a8c7f093d617647bf9cce0e7f959da54c1163c5fe6f05d147" Sep 30 15:01:34 crc kubenswrapper[4763]: I0930 15:01:34.398922 4763 scope.go:117] "RemoveContainer" 
containerID="80c354a53c4ebc7bf80ad72e734b8cd2bf43bbb044abc71797ed3da2ee1e6448" Sep 30 15:01:34 crc kubenswrapper[4763]: I0930 15:01:34.424529 4763 scope.go:117] "RemoveContainer" containerID="9b928cce12141f53b983ad55837a56b815109996c9286159819152492255b8a8" Sep 30 15:01:34 crc kubenswrapper[4763]: I0930 15:01:34.454030 4763 scope.go:117] "RemoveContainer" containerID="f88b90e5d60bd578d52a548ee890b8d8c8a00a4f9c0f6cf2f86d456a1c0eef2c" Sep 30 15:01:34 crc kubenswrapper[4763]: I0930 15:01:34.484200 4763 scope.go:117] "RemoveContainer" containerID="fbe318bf1311ce7556c6e21fea49f6e875dcfac157ac9c02f2449116aae6120a" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.490528 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:01:46 crc kubenswrapper[4763]: E0930 15:01:46.491414 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.887858 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 15:01:46 crc kubenswrapper[4763]: E0930 15:01:46.888328 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea5ea06-98fb-4192-b677-805f1c620e81" containerName="collect-profiles" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.888351 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea5ea06-98fb-4192-b677-805f1c620e81" containerName="collect-profiles" Sep 30 15:01:46 crc kubenswrapper[4763]: E0930 15:01:46.888390 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25387383-f43e-4bf3-a40f-ea7da073e853" containerName="mariadb-client" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.888408 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="25387383-f43e-4bf3-a40f-ea7da073e853" containerName="mariadb-client" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.890389 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ea5ea06-98fb-4192-b677-805f1c620e81" containerName="collect-profiles" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.890456 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="25387383-f43e-4bf3-a40f-ea7da073e853" containerName="mariadb-client" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.891580 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.893282 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.893565 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.894192 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-w9rdb" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.896474 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.907119 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.908680 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.910494 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.915176 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.920660 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Sep 30 15:01:46 crc kubenswrapper[4763]: I0930 15:01:46.927196 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.006228 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b82f57c-bca5-4fad-949d-13d9fdf45d62-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.006283 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b82f57c-bca5-4fad-949d-13d9fdf45d62-config\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.006319 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b20faf9-26ae-48bf-9293-541e5b2c3468-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.006430 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b20faf9-26ae-48bf-9293-541e5b2c3468-config\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.006524 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b82f57c-bca5-4fad-949d-13d9fdf45d62-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " 
pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.006551 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b20faf9-26ae-48bf-9293-541e5b2c3468-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.006620 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br4cz\" (UniqueName: \"kubernetes.io/projected/7b20faf9-26ae-48bf-9293-541e5b2c3468-kube-api-access-br4cz\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.006742 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b20faf9-26ae-48bf-9293-541e5b2c3468-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.006801 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b82f57c-bca5-4fad-949d-13d9fdf45d62-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.006863 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h64kx\" (UniqueName: \"kubernetes.io/projected/2b82f57c-bca5-4fad-949d-13d9fdf45d62-kube-api-access-h64kx\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.007030 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-df11c5d1-e667-4f2b-86eb-03d0c8e2948b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df11c5d1-e667-4f2b-86eb-03d0c8e2948b\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.007130 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c26b4f5e-de46-486d-a956-46dc3c03efea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c26b4f5e-de46-486d-a956-46dc3c03efea\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.083499 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.085222 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.090947 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.091004 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.091174 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-k75r6" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.094144 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.108403 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109021 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b20faf9-26ae-48bf-9293-541e5b2c3468-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109069 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d44ff345-47d2-4e11-bb92-1b3e00eaba74-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109101 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b20faf9-26ae-48bf-9293-541e5b2c3468-config\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109133 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-239266ac-2a4e-4576-9e86-f878ec6c4e52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-239266ac-2a4e-4576-9e86-f878ec6c4e52\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109166 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d44ff345-47d2-4e11-bb92-1b3e00eaba74-config\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109190 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d44ff345-47d2-4e11-bb92-1b3e00eaba74-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109214 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b82f57c-bca5-4fad-949d-13d9fdf45d62-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: 
I0930 15:01:47.109240 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b20faf9-26ae-48bf-9293-541e5b2c3468-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109262 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br4cz\" (UniqueName: \"kubernetes.io/projected/7b20faf9-26ae-48bf-9293-541e5b2c3468-kube-api-access-br4cz\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109294 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b20faf9-26ae-48bf-9293-541e5b2c3468-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109319 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b82f57c-bca5-4fad-949d-13d9fdf45d62-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109346 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h64kx\" (UniqueName: \"kubernetes.io/projected/2b82f57c-bca5-4fad-949d-13d9fdf45d62-kube-api-access-h64kx\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109406 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d44ff345-47d2-4e11-bb92-1b3e00eaba74-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109444 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-df11c5d1-e667-4f2b-86eb-03d0c8e2948b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df11c5d1-e667-4f2b-86eb-03d0c8e2948b\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109478 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c26b4f5e-de46-486d-a956-46dc3c03efea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c26b4f5e-de46-486d-a956-46dc3c03efea\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109516 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bdkm\" (UniqueName: \"kubernetes.io/projected/d44ff345-47d2-4e11-bb92-1b3e00eaba74-kube-api-access-9bdkm\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109545 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2b82f57c-bca5-4fad-949d-13d9fdf45d62-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.109569 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b82f57c-bca5-4fad-949d-13d9fdf45d62-config\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.110110 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.110500 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b82f57c-bca5-4fad-949d-13d9fdf45d62-config\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.113128 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b20faf9-26ae-48bf-9293-541e5b2c3468-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.113628 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b20faf9-26ae-48bf-9293-541e5b2c3468-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.113848 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.113990 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b82f57c-bca5-4fad-949d-13d9fdf45d62-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.115440 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.115959 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b82f57c-bca5-4fad-949d-13d9fdf45d62-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.126713 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.126729 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b20faf9-26ae-48bf-9293-541e5b2c3468-config\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.127377 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.127416 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c26b4f5e-de46-486d-a956-46dc3c03efea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c26b4f5e-de46-486d-a956-46dc3c03efea\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4cccc1c46b947d869d74a6c0f4e7f14b0b39237bce0a27c372d5f7b42d326fb7/globalmount\"" pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.129378 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b82f57c-bca5-4fad-949d-13d9fdf45d62-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.130550 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.133299 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b20faf9-26ae-48bf-9293-541e5b2c3468-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.138669 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.138718 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-df11c5d1-e667-4f2b-86eb-03d0c8e2948b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df11c5d1-e667-4f2b-86eb-03d0c8e2948b\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e57f7013882cb3c4efb969c424795b20b5ec2c5683d4c73c7cf265b90505c60e/globalmount\"" pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.139307 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h64kx\" (UniqueName: \"kubernetes.io/projected/2b82f57c-bca5-4fad-949d-13d9fdf45d62-kube-api-access-h64kx\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.147399 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br4cz\" (UniqueName: \"kubernetes.io/projected/7b20faf9-26ae-48bf-9293-541e5b2c3468-kube-api-access-br4cz\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.182448 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-df11c5d1-e667-4f2b-86eb-03d0c8e2948b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df11c5d1-e667-4f2b-86eb-03d0c8e2948b\") pod \"ovsdbserver-nb-0\" (UID: \"2b82f57c-bca5-4fad-949d-13d9fdf45d62\") " pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.183205 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c26b4f5e-de46-486d-a956-46dc3c03efea\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c26b4f5e-de46-486d-a956-46dc3c03efea\") pod \"ovsdbserver-nb-1\" (UID: \"7b20faf9-26ae-48bf-9293-541e5b2c3468\") " pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.211548 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-239266ac-2a4e-4576-9e86-f878ec6c4e52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-239266ac-2a4e-4576-9e86-f878ec6c4e52\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.211633 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1252ad03-af5a-458c-a660-74b5389d2f50-config\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.211669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d44ff345-47d2-4e11-bb92-1b3e00eaba74-config\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.211691 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e9670da-68b7-4ec2-ada3-51c74cabd937-config\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.211720 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d44ff345-47d2-4e11-bb92-1b3e00eaba74-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.211740 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/746fc1ac-b115-4114-a611-4b2c18e779d3-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.211763 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/746fc1ac-b115-4114-a611-4b2c18e779d3-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.211943 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1252ad03-af5a-458c-a660-74b5389d2f50-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212008 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-620553e4-8453-474f-9db7-85fe4cc4842e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-620553e4-8453-474f-9db7-85fe4cc4842e\") pod \"ovsdbserver-sb-0\" (UID: 
\"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212048 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d44ff345-47d2-4e11-bb92-1b3e00eaba74-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212092 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crmdb\" (UniqueName: \"kubernetes.io/projected/1252ad03-af5a-458c-a660-74b5389d2f50-kube-api-access-crmdb\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212129 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e9670da-68b7-4ec2-ada3-51c74cabd937-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212163 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1252ad03-af5a-458c-a660-74b5389d2f50-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212189 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-50b3878a-4497-4564-9702-89e7fb77678b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b3878a-4497-4564-9702-89e7fb77678b\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212324 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fc1ac-b115-4114-a611-4b2c18e779d3-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212360 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bdkm\" (UniqueName: \"kubernetes.io/projected/d44ff345-47d2-4e11-bb92-1b3e00eaba74-kube-api-access-9bdkm\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212385 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-76bcfd0a-3c22-4c05-9ff1-219b49363978\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-76bcfd0a-3c22-4c05-9ff1-219b49363978\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212414 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9670da-68b7-4ec2-ada3-51c74cabd937-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: 
\"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212453 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcnw9\" (UniqueName: \"kubernetes.io/projected/0e9670da-68b7-4ec2-ada3-51c74cabd937-kube-api-access-kcnw9\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212477 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/746fc1ac-b115-4114-a611-4b2c18e779d3-config\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212499 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e9670da-68b7-4ec2-ada3-51c74cabd937-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212524 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbxrt\" (UniqueName: \"kubernetes.io/projected/746fc1ac-b115-4114-a611-4b2c18e779d3-kube-api-access-xbxrt\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212552 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d44ff345-47d2-4e11-bb92-1b3e00eaba74-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212662 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d44ff345-47d2-4e11-bb92-1b3e00eaba74-config\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.212687 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1252ad03-af5a-458c-a660-74b5389d2f50-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.213061 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d44ff345-47d2-4e11-bb92-1b3e00eaba74-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.213070 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d44ff345-47d2-4e11-bb92-1b3e00eaba74-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.215923 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME 
capability not set. Skipping MountDevice... Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.215968 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-239266ac-2a4e-4576-9e86-f878ec6c4e52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-239266ac-2a4e-4576-9e86-f878ec6c4e52\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/49de12c52bb4f8ffb34d0ee9920d1168613545ccbec072907b81bee5e6d32d18/globalmount\"" pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.216683 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d44ff345-47d2-4e11-bb92-1b3e00eaba74-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.227670 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bdkm\" (UniqueName: \"kubernetes.io/projected/d44ff345-47d2-4e11-bb92-1b3e00eaba74-kube-api-access-9bdkm\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.236023 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.244194 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-239266ac-2a4e-4576-9e86-f878ec6c4e52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-239266ac-2a4e-4576-9e86-f878ec6c4e52\") pod \"ovsdbserver-nb-2\" (UID: \"d44ff345-47d2-4e11-bb92-1b3e00eaba74\") " pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.249090 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.262418 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.314730 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-620553e4-8453-474f-9db7-85fe4cc4842e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-620553e4-8453-474f-9db7-85fe4cc4842e\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.315100 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crmdb\" (UniqueName: \"kubernetes.io/projected/1252ad03-af5a-458c-a660-74b5389d2f50-kube-api-access-crmdb\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.315143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e9670da-68b7-4ec2-ada3-51c74cabd937-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.315174 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1252ad03-af5a-458c-a660-74b5389d2f50-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.315203 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-50b3878a-4497-4564-9702-89e7fb77678b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b3878a-4497-4564-9702-89e7fb77678b\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.315227 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fc1ac-b115-4114-a611-4b2c18e779d3-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.315256 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-76bcfd0a-3c22-4c05-9ff1-219b49363978\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-76bcfd0a-3c22-4c05-9ff1-219b49363978\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.315282 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9670da-68b7-4ec2-ada3-51c74cabd937-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.315314 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcnw9\" (UniqueName: \"kubernetes.io/projected/0e9670da-68b7-4ec2-ada3-51c74cabd937-kube-api-access-kcnw9\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2" Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.315337 
4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e9670da-68b7-4ec2-ada3-51c74cabd937-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.315357 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/746fc1ac-b115-4114-a611-4b2c18e779d3-config\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.315383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbxrt\" (UniqueName: \"kubernetes.io/projected/746fc1ac-b115-4114-a611-4b2c18e779d3-kube-api-access-xbxrt\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.316119 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1252ad03-af5a-458c-a660-74b5389d2f50-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.316218 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1252ad03-af5a-458c-a660-74b5389d2f50-config\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.316255 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e9670da-68b7-4ec2-ada3-51c74cabd937-config\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.316277 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/746fc1ac-b115-4114-a611-4b2c18e779d3-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.316327 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/746fc1ac-b115-4114-a611-4b2c18e779d3-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.317131 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1252ad03-af5a-458c-a660-74b5389d2f50-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.317179 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/746fc1ac-b115-4114-a611-4b2c18e779d3-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.318331 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1252ad03-af5a-458c-a660-74b5389d2f50-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.323004 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.323038 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.323076 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-50b3878a-4497-4564-9702-89e7fb77678b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b3878a-4497-4564-9702-89e7fb77678b\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4cdb09124447966a29ee61d498df90513f08987db8ba6728d6951b191395e556/globalmount\"" pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.323107 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9670da-68b7-4ec2-ada3-51c74cabd937-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.323075 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-76bcfd0a-3c22-4c05-9ff1-219b49363978\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-76bcfd0a-3c22-4c05-9ff1-219b49363978\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/77d629f12737047db9a04065634df461a88d63f7a771530dcbeb3e08b3cc1eb0/globalmount\"" pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.323719 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e9670da-68b7-4ec2-ada3-51c74cabd937-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.323780 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e9670da-68b7-4ec2-ada3-51c74cabd937-config\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.324145 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.324178 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-620553e4-8453-474f-9db7-85fe4cc4842e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-620553e4-8453-474f-9db7-85fe4cc4842e\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/25f6c2bb410a65958701f122470fb44dd84558d289226216711b36d7f7f5c60a/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.324406 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1252ad03-af5a-458c-a660-74b5389d2f50-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.324641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1252ad03-af5a-458c-a660-74b5389d2f50-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.325241 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/746fc1ac-b115-4114-a611-4b2c18e779d3-config\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.325293 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/746fc1ac-b115-4114-a611-4b2c18e779d3-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.325618 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1252ad03-af5a-458c-a660-74b5389d2f50-config\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.327512 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/746fc1ac-b115-4114-a611-4b2c18e779d3-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.331384 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e9670da-68b7-4ec2-ada3-51c74cabd937-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.338243 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcnw9\" (UniqueName: \"kubernetes.io/projected/0e9670da-68b7-4ec2-ada3-51c74cabd937-kube-api-access-kcnw9\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.339845 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crmdb\" (UniqueName: \"kubernetes.io/projected/1252ad03-af5a-458c-a660-74b5389d2f50-kube-api-access-crmdb\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.340072 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbxrt\" (UniqueName: \"kubernetes.io/projected/746fc1ac-b115-4114-a611-4b2c18e779d3-kube-api-access-xbxrt\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.374678 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-50b3878a-4497-4564-9702-89e7fb77678b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b3878a-4497-4564-9702-89e7fb77678b\") pod \"ovsdbserver-sb-1\" (UID: \"746fc1ac-b115-4114-a611-4b2c18e779d3\") " pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.378280 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-76bcfd0a-3c22-4c05-9ff1-219b49363978\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-76bcfd0a-3c22-4c05-9ff1-219b49363978\") pod \"ovsdbserver-sb-2\" (UID: \"0e9670da-68b7-4ec2-ada3-51c74cabd937\") " pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.390591 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-620553e4-8453-474f-9db7-85fe4cc4842e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-620553e4-8453-474f-9db7-85fe4cc4842e\") pod \"ovsdbserver-sb-0\" (UID: \"1252ad03-af5a-458c-a660-74b5389d2f50\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.413119 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.591548 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.597133 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.791672 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.868415 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Sep 30 15:01:47 crc kubenswrapper[4763]: W0930 15:01:47.877055 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd44ff345_47d2_4e11_bb92_1b3e00eaba74.slice/crio-6319d9b7cc011168fa2507f377aeed72b9176881935d6a3b7133282f5bbce1e3 WatchSource:0}: Error finding container 6319d9b7cc011168fa2507f377aeed72b9176881935d6a3b7133282f5bbce1e3: Status 404 returned error can't find the container with id 6319d9b7cc011168fa2507f377aeed72b9176881935d6a3b7133282f5bbce1e3
Sep 30 15:01:47 crc kubenswrapper[4763]: I0930 15:01:47.966737 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Sep 30 15:01:47 crc kubenswrapper[4763]: W0930 15:01:47.971877 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1252ad03_af5a_458c_a660_74b5389d2f50.slice/crio-132b132cd70178e09e031febd8c82824304231dc93aba8bffead5f93d0d0fcff WatchSource:0}: Error finding container 132b132cd70178e09e031febd8c82824304231dc93aba8bffead5f93d0d0fcff: Status 404 returned error can't find the container with id 132b132cd70178e09e031febd8c82824304231dc93aba8bffead5f93d0d0fcff
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.138405 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Sep 30 15:01:48 crc kubenswrapper[4763]: W0930 15:01:48.158220 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e9670da_68b7_4ec2_ada3_51c74cabd937.slice/crio-725afb40c0aebd689c00b3b5ee18c78da2292ea05b888461ac121c9590c7f3b8 WatchSource:0}: Error finding container 725afb40c0aebd689c00b3b5ee18c78da2292ea05b888461ac121c9590c7f3b8: Status 404 returned error can't find the container with id 725afb40c0aebd689c00b3b5ee18c78da2292ea05b888461ac121c9590c7f3b8
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.219061 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Sep 30 15:01:48 crc kubenswrapper[4763]: W0930 15:01:48.233672 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod746fc1ac_b115_4114_a611_4b2c18e779d3.slice/crio-f48ed52673f11123da0a543793777fc113273a530e3f75c87e931d38f69a4870 WatchSource:0}: Error finding container f48ed52673f11123da0a543793777fc113273a530e3f75c87e931d38f69a4870: Status 404 returned error can't find the container with id f48ed52673f11123da0a543793777fc113273a530e3f75c87e931d38f69a4870
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.367002 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d44ff345-47d2-4e11-bb92-1b3e00eaba74","Type":"ContainerStarted","Data":"1d66cd18fb3a343d71424e9465af968d8d214caddd766712b00088ed890f506e"}
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.367079 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d44ff345-47d2-4e11-bb92-1b3e00eaba74","Type":"ContainerStarted","Data":"de541269f62bdc3fc74b84369e1f4774d6b7effd5bd51fcf038e342cba60f8d2"}
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.367110 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d44ff345-47d2-4e11-bb92-1b3e00eaba74","Type":"ContainerStarted","Data":"6319d9b7cc011168fa2507f377aeed72b9176881935d6a3b7133282f5bbce1e3"}
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.373851 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1252ad03-af5a-458c-a660-74b5389d2f50","Type":"ContainerStarted","Data":"491bef3bdc451c08e352f94995e3dc9bdbf953874078c0c584d76aefe0ab8e64"}
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.373899 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1252ad03-af5a-458c-a660-74b5389d2f50","Type":"ContainerStarted","Data":"132b132cd70178e09e031febd8c82824304231dc93aba8bffead5f93d0d0fcff"}
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.375359 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"746fc1ac-b115-4114-a611-4b2c18e779d3","Type":"ContainerStarted","Data":"f48ed52673f11123da0a543793777fc113273a530e3f75c87e931d38f69a4870"}
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.378146 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2b82f57c-bca5-4fad-949d-13d9fdf45d62","Type":"ContainerStarted","Data":"0db37f377c090e955569fc509c7ac0d1691f3b08b6617e95ae8fd2e632f33a40"}
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.378214 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2b82f57c-bca5-4fad-949d-13d9fdf45d62","Type":"ContainerStarted","Data":"837a0695f8da92d0d899314ab744c56e3e6f5eb94cb5b9d874cd042359c4dc81"}
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.378229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2b82f57c-bca5-4fad-949d-13d9fdf45d62","Type":"ContainerStarted","Data":"fbc7a1c9b368f6467d3ba98ab7f0eae046d3b458ccc54bb3619fdea485c142a6"}
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.380307 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"0e9670da-68b7-4ec2-ada3-51c74cabd937","Type":"ContainerStarted","Data":"725afb40c0aebd689c00b3b5ee18c78da2292ea05b888461ac121c9590c7f3b8"}
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.387932 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.387912584 podStartE2EDuration="3.387912584s" podCreationTimestamp="2025-09-30 15:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:01:48.38535459 +0000 UTC m=+5180.523914875" watchObservedRunningTime="2025-09-30 15:01:48.387912584 +0000 UTC m=+5180.526472869"
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.431044 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.431026669 podStartE2EDuration="3.431026669s" podCreationTimestamp="2025-09-30 15:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:01:48.405270976 +0000 UTC m=+5180.543831261" watchObservedRunningTime="2025-09-30 15:01:48.431026669 +0000 UTC m=+5180.569586944"
Sep 30 15:01:48 crc kubenswrapper[4763]: I0930 15:01:48.728963 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Sep 30 15:01:48 crc kubenswrapper[4763]: W0930 15:01:48.732677 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b20faf9_26ae_48bf_9293_541e5b2c3468.slice/crio-18bf164ee5baeabbf27d626bb2cbf295196675d3f8c319738a04c419e5a388b1 WatchSource:0}: Error finding container 18bf164ee5baeabbf27d626bb2cbf295196675d3f8c319738a04c419e5a388b1: Status 404 returned error can't find the container with id 18bf164ee5baeabbf27d626bb2cbf295196675d3f8c319738a04c419e5a388b1
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.389404 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1252ad03-af5a-458c-a660-74b5389d2f50","Type":"ContainerStarted","Data":"f080a7a6b69c3b9ef420b20fa25fd3311ec34fe1d102349f7c6181c883f7783b"}
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.391393 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"746fc1ac-b115-4114-a611-4b2c18e779d3","Type":"ContainerStarted","Data":"2f9386debaffef13bc11dbf32d0cec902132cc8b9da23d8762acebd8485cc467"}
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.391429 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"746fc1ac-b115-4114-a611-4b2c18e779d3","Type":"ContainerStarted","Data":"2124d5f38f6f158e68871ac58e5453118b55e978cd22bc9d96b8686bf1bc91cf"}
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.396873 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7b20faf9-26ae-48bf-9293-541e5b2c3468","Type":"ContainerStarted","Data":"26c93725c99fe5fcab1633f5d9826e37f3e661e688f01cf5e1bcee560eef9e54"}
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.396914 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7b20faf9-26ae-48bf-9293-541e5b2c3468","Type":"ContainerStarted","Data":"04140e68671f7fa64a03d572d103e8cccc9e589fad81362f95bf6e42fb8847c3"}
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.396924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7b20faf9-26ae-48bf-9293-541e5b2c3468","Type":"ContainerStarted","Data":"18bf164ee5baeabbf27d626bb2cbf295196675d3f8c319738a04c419e5a388b1"}
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.399705 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"0e9670da-68b7-4ec2-ada3-51c74cabd937","Type":"ContainerStarted","Data":"1894d091544602b5044446cabe07b15e194f5d5ff4cbff343c86bea7998d3ce8"}
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.399769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"0e9670da-68b7-4ec2-ada3-51c74cabd937","Type":"ContainerStarted","Data":"208589ab124900abaea025c647e12ee806f27e2f414bccca67b8f62c7e3cf6dd"}
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.408180 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.408161746 podStartE2EDuration="3.408161746s" podCreationTimestamp="2025-09-30 15:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:01:49.407146682 +0000 UTC m=+5181.545706997" watchObservedRunningTime="2025-09-30 15:01:49.408161746 +0000 UTC m=+5181.546722031"
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.429790 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.429769876 podStartE2EDuration="4.429769876s" podCreationTimestamp="2025-09-30 15:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:01:49.424230857 +0000 UTC m=+5181.562791142" watchObservedRunningTime="2025-09-30 15:01:49.429769876 +0000 UTC m=+5181.568330181"
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.450527 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.450506143 podStartE2EDuration="3.450506143s" podCreationTimestamp="2025-09-30 15:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:01:49.441375645 +0000 UTC m=+5181.579935970" watchObservedRunningTime="2025-09-30 15:01:49.450506143 +0000 UTC m=+5181.589066438"
Sep 30 15:01:49 crc kubenswrapper[4763]: I0930 15:01:49.461391 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.461367103 podStartE2EDuration="3.461367103s" podCreationTimestamp="2025-09-30 15:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:01:49.458209634 +0000 UTC m=+5181.596769929" watchObservedRunningTime="2025-09-30 15:01:49.461367103 +0000 UTC m=+5181.599927388"
Sep 30 15:01:50 crc kubenswrapper[4763]: I0930 15:01:50.236931 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Sep 30 15:01:50 crc kubenswrapper[4763]: I0930 15:01:50.249628 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Sep 30 15:01:50 crc kubenswrapper[4763]: I0930 15:01:50.263514 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Sep 30 15:01:50 crc kubenswrapper[4763]: I0930 15:01:50.285559 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Sep 30 15:01:50 crc kubenswrapper[4763]: I0930 15:01:50.406990 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Sep 30 15:01:50 crc kubenswrapper[4763]: I0930 15:01:50.413529 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:50 crc kubenswrapper[4763]: I0930 15:01:50.592275 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:50 crc kubenswrapper[4763]: I0930 15:01:50.597486 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.250176 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.263281 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.290110 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.413415 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.581091 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-685b46d9d5-8jc9r"]
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.582573 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.585214 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.587142 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685b46d9d5-8jc9r"]
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.592090 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.597655 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.705846 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-config\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.706019 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-ovsdbserver-nb\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.706051 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpc5r\" (UniqueName: \"kubernetes.io/projected/2aaa9b0a-b78c-4512-a481-a9859611712e-kube-api-access-xpc5r\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.706127 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-dns-svc\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.807573 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-ovsdbserver-nb\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.807961 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpc5r\" (UniqueName: \"kubernetes.io/projected/2aaa9b0a-b78c-4512-a481-a9859611712e-kube-api-access-xpc5r\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.808116 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-dns-svc\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.808344 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-config\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.809147 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-ovsdbserver-nb\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.809181 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-dns-svc\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.809226 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-config\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.826295 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpc5r\" (UniqueName: \"kubernetes.io/projected/2aaa9b0a-b78c-4512-a481-a9859611712e-kube-api-access-xpc5r\") pod \"dnsmasq-dns-685b46d9d5-8jc9r\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") " pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:52 crc kubenswrapper[4763]: I0930 15:01:52.901129 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.298476 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.303515 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.346458 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.352560 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685b46d9d5-8jc9r"]
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.446662 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r" event={"ID":"2aaa9b0a-b78c-4512-a481-a9859611712e","Type":"ContainerStarted","Data":"d547739e9d2681ccddd8d9341cfe94b365f3d716fe22729adc18bfa4442ba88f"}
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.465577 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.517037 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.518750 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.637389 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.642896 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.696565 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.699914 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.877612 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685b46d9d5-8jc9r"]
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.893982 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86d5b75b95-wdvth"]
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.896229 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.898787 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.911664 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d5b75b95-wdvth"]
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.927458 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-sb\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.927499 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-config\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.927547 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-dns-svc\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.927589 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-nb\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:53 crc kubenswrapper[4763]: I0930 15:01:53.927628 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vbp\" (UniqueName: \"kubernetes.io/projected/110dc091-c67e-40ca-af69-d37995b5c65f-kube-api-access-44vbp\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.029156 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-nb\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.029235 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44vbp\" (UniqueName: \"kubernetes.io/projected/110dc091-c67e-40ca-af69-d37995b5c65f-kube-api-access-44vbp\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.029347 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-sb\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.029375 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-config\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.029436 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-dns-svc\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.030429 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-config\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.030648 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-dns-svc\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.030880 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-nb\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.030902 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-sb\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.049634 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vbp\" (UniqueName: \"kubernetes.io/projected/110dc091-c67e-40ca-af69-d37995b5c65f-kube-api-access-44vbp\") pod \"dnsmasq-dns-86d5b75b95-wdvth\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.220237 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.454303 4763 generic.go:334] "Generic (PLEG): container finished" podID="2aaa9b0a-b78c-4512-a481-a9859611712e" containerID="b13ccf3b6e3aec461341964a1fdf8c63087dea31da40bdfa086840db0a0fea27" exitCode=0
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.454445 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r" event={"ID":"2aaa9b0a-b78c-4512-a481-a9859611712e","Type":"ContainerDied","Data":"b13ccf3b6e3aec461341964a1fdf8c63087dea31da40bdfa086840db0a0fea27"}
Sep 30 15:01:54 crc kubenswrapper[4763]: I0930 15:01:54.657592 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d5b75b95-wdvth"]
Sep 30 15:01:54 crc kubenswrapper[4763]: W0930 15:01:54.658781 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod110dc091_c67e_40ca_af69_d37995b5c65f.slice/crio-fe6e582da481a7a4f98ac415757454e7978812135f36fcbbeca3929de50a1d05 WatchSource:0}: Error finding container fe6e582da481a7a4f98ac415757454e7978812135f36fcbbeca3929de50a1d05: Status 404 returned error can't find the container with id fe6e582da481a7a4f98ac415757454e7978812135f36fcbbeca3929de50a1d05
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.465256 4763 generic.go:334] "Generic (PLEG): container finished" podID="110dc091-c67e-40ca-af69-d37995b5c65f" containerID="eba587986a0110cb0916e039d76b2cbeeb3ddc94c8f956fe975ff293cf3e4d24" exitCode=0
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.465339 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth" event={"ID":"110dc091-c67e-40ca-af69-d37995b5c65f","Type":"ContainerDied","Data":"eba587986a0110cb0916e039d76b2cbeeb3ddc94c8f956fe975ff293cf3e4d24"}
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.465659 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth" event={"ID":"110dc091-c67e-40ca-af69-d37995b5c65f","Type":"ContainerStarted","Data":"fe6e582da481a7a4f98ac415757454e7978812135f36fcbbeca3929de50a1d05"}
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.467646 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r" event={"ID":"2aaa9b0a-b78c-4512-a481-a9859611712e","Type":"ContainerStarted","Data":"bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2"}
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.467798 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.467779 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r" podUID="2aaa9b0a-b78c-4512-a481-a9859611712e" containerName="dnsmasq-dns" containerID="cri-o://bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2" gracePeriod=10
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.510859 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r" podStartSLOduration=3.510842854 podStartE2EDuration="3.510842854s" podCreationTimestamp="2025-09-30 15:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:01:55.507245004 +0000 UTC m=+5187.645805309" watchObservedRunningTime="2025-09-30 15:01:55.510842854 +0000 UTC m=+5187.649403139"
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.898215 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.961550 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-ovsdbserver-nb\") pod \"2aaa9b0a-b78c-4512-a481-a9859611712e\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") "
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.961670 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpc5r\" (UniqueName: \"kubernetes.io/projected/2aaa9b0a-b78c-4512-a481-a9859611712e-kube-api-access-xpc5r\") pod \"2aaa9b0a-b78c-4512-a481-a9859611712e\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") "
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.961721 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-config\") pod \"2aaa9b0a-b78c-4512-a481-a9859611712e\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") "
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.961767 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-dns-svc\") pod \"2aaa9b0a-b78c-4512-a481-a9859611712e\" (UID: \"2aaa9b0a-b78c-4512-a481-a9859611712e\") "
Sep 30 15:01:55 crc kubenswrapper[4763]: I0930 15:01:55.967245 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aaa9b0a-b78c-4512-a481-a9859611712e-kube-api-access-xpc5r" (OuterVolumeSpecName: "kube-api-access-xpc5r") pod "2aaa9b0a-b78c-4512-a481-a9859611712e" (UID: "2aaa9b0a-b78c-4512-a481-a9859611712e"). InnerVolumeSpecName "kube-api-access-xpc5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.000898 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2aaa9b0a-b78c-4512-a481-a9859611712e" (UID: "2aaa9b0a-b78c-4512-a481-a9859611712e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.003649 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-config" (OuterVolumeSpecName: "config") pod "2aaa9b0a-b78c-4512-a481-a9859611712e" (UID: "2aaa9b0a-b78c-4512-a481-a9859611712e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.003958 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2aaa9b0a-b78c-4512-a481-a9859611712e" (UID: "2aaa9b0a-b78c-4512-a481-a9859611712e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.063503 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.063550 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.063568 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpc5r\" (UniqueName: \"kubernetes.io/projected/2aaa9b0a-b78c-4512-a481-a9859611712e-kube-api-access-xpc5r\") on node \"crc\" DevicePath \"\""
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.063584 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aaa9b0a-b78c-4512-a481-a9859611712e-config\") on node \"crc\" DevicePath \"\""
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.360929 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"]
Sep 30 15:01:56 crc kubenswrapper[4763]: E0930 15:01:56.361269 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aaa9b0a-b78c-4512-a481-a9859611712e" containerName="dnsmasq-dns"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.361285 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aaa9b0a-b78c-4512-a481-a9859611712e" containerName="dnsmasq-dns"
Sep 30 15:01:56 crc kubenswrapper[4763]: E0930 15:01:56.361309 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aaa9b0a-b78c-4512-a481-a9859611712e" containerName="init"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.361317 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aaa9b0a-b78c-4512-a481-a9859611712e" containerName="init"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.361519 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aaa9b0a-b78c-4512-a481-a9859611712e" containerName="dnsmasq-dns"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.362198 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.364524 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.373698 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.468911 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/e7101812-e158-46b0-af76-063835526dc4-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"e7101812-e158-46b0-af76-063835526dc4\") " pod="openstack/ovn-copy-data"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.469289 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnvk5\" (UniqueName: \"kubernetes.io/projected/e7101812-e158-46b0-af76-063835526dc4-kube-api-access-xnvk5\") pod \"ovn-copy-data\" (UID: \"e7101812-e158-46b0-af76-063835526dc4\") " pod="openstack/ovn-copy-data"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.469345 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-12edd487-a6ba-4bab-b8ed-91b7caf62281\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12edd487-a6ba-4bab-b8ed-91b7caf62281\") pod \"ovn-copy-data\" (UID: \"e7101812-e158-46b0-af76-063835526dc4\") " pod="openstack/ovn-copy-data"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.478696 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth" event={"ID":"110dc091-c67e-40ca-af69-d37995b5c65f","Type":"ContainerStarted","Data":"16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815"}
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.479102 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.481169 4763 generic.go:334] "Generic (PLEG): container finished" podID="2aaa9b0a-b78c-4512-a481-a9859611712e" containerID="bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2" exitCode=0
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.481201 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r" event={"ID":"2aaa9b0a-b78c-4512-a481-a9859611712e","Type":"ContainerDied","Data":"bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2"}
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.481220 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r" event={"ID":"2aaa9b0a-b78c-4512-a481-a9859611712e","Type":"ContainerDied","Data":"d547739e9d2681ccddd8d9341cfe94b365f3d716fe22729adc18bfa4442ba88f"}
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.481238 4763 scope.go:117] "RemoveContainer" containerID="bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.481356 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685b46d9d5-8jc9r"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.494823 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth" podStartSLOduration=3.494802452 podStartE2EDuration="3.494802452s" podCreationTimestamp="2025-09-30 15:01:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:01:56.493742535 +0000 UTC m=+5188.632302840" watchObservedRunningTime="2025-09-30 15:01:56.494802452 +0000 UTC m=+5188.633362737"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.518818 4763 scope.go:117] "RemoveContainer" containerID="b13ccf3b6e3aec461341964a1fdf8c63087dea31da40bdfa086840db0a0fea27"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.521333 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685b46d9d5-8jc9r"]
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.521364 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-685b46d9d5-8jc9r"]
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.540781 4763 scope.go:117] "RemoveContainer" containerID="bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2"
Sep 30 15:01:56 crc kubenswrapper[4763]: E0930 15:01:56.541258 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2\": container with ID starting with bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2 not found: ID does not exist" containerID="bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.541292 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2"} err="failed to get container status \"bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2\": rpc error: code = NotFound desc = could not find container \"bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2\": container with ID starting with bbce84069bd8e6d74d58541a836f85d3de367e12ec7fd13507cf097114faf6b2 not found: ID does not exist"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.541313 4763 scope.go:117] "RemoveContainer" containerID="b13ccf3b6e3aec461341964a1fdf8c63087dea31da40bdfa086840db0a0fea27"
Sep 30 15:01:56 crc kubenswrapper[4763]: E0930 15:01:56.541660 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b13ccf3b6e3aec461341964a1fdf8c63087dea31da40bdfa086840db0a0fea27\": container with ID starting with b13ccf3b6e3aec461341964a1fdf8c63087dea31da40bdfa086840db0a0fea27 not found: ID does not exist" containerID="b13ccf3b6e3aec461341964a1fdf8c63087dea31da40bdfa086840db0a0fea27"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.541761 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13ccf3b6e3aec461341964a1fdf8c63087dea31da40bdfa086840db0a0fea27"} err="failed to get container status \"b13ccf3b6e3aec461341964a1fdf8c63087dea31da40bdfa086840db0a0fea27\": rpc error: code = NotFound desc = could not find container \"b13ccf3b6e3aec461341964a1fdf8c63087dea31da40bdfa086840db0a0fea27\": container with ID starting with b13ccf3b6e3aec461341964a1fdf8c63087dea31da40bdfa086840db0a0fea27 not found: ID does not exist"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.570844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnvk5\" (UniqueName: \"kubernetes.io/projected/e7101812-e158-46b0-af76-063835526dc4-kube-api-access-xnvk5\") pod \"ovn-copy-data\" (UID: \"e7101812-e158-46b0-af76-063835526dc4\") " pod="openstack/ovn-copy-data"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.570953 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-12edd487-a6ba-4bab-b8ed-91b7caf62281\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12edd487-a6ba-4bab-b8ed-91b7caf62281\") pod \"ovn-copy-data\" (UID: \"e7101812-e158-46b0-af76-063835526dc4\") " pod="openstack/ovn-copy-data"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.571032 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/e7101812-e158-46b0-af76-063835526dc4-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"e7101812-e158-46b0-af76-063835526dc4\") " pod="openstack/ovn-copy-data"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.574381 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.574430 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-12edd487-a6ba-4bab-b8ed-91b7caf62281\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12edd487-a6ba-4bab-b8ed-91b7caf62281\") pod \"ovn-copy-data\" (UID: \"e7101812-e158-46b0-af76-063835526dc4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/01427e21d6f480c11f9349ac9e4ec08af89ceb93975ab26a1977819b0ea9bae6/globalmount\"" pod="openstack/ovn-copy-data"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.577172 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/e7101812-e158-46b0-af76-063835526dc4-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"e7101812-e158-46b0-af76-063835526dc4\") " pod="openstack/ovn-copy-data"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.585779 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnvk5\" (UniqueName: \"kubernetes.io/projected/e7101812-e158-46b0-af76-063835526dc4-kube-api-access-xnvk5\") pod \"ovn-copy-data\" (UID: \"e7101812-e158-46b0-af76-063835526dc4\") " pod="openstack/ovn-copy-data"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.601054 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-12edd487-a6ba-4bab-b8ed-91b7caf62281\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12edd487-a6ba-4bab-b8ed-91b7caf62281\") pod \"ovn-copy-data\" (UID: \"e7101812-e158-46b0-af76-063835526dc4\") " pod="openstack/ovn-copy-data"
Sep 30 15:01:56 crc kubenswrapper[4763]: I0930 15:01:56.685028 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Sep 30 15:01:57 crc kubenswrapper[4763]: I0930 15:01:57.163787 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Sep 30 15:01:57 crc kubenswrapper[4763]: W0930 15:01:57.168291 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7101812_e158_46b0_af76_063835526dc4.slice/crio-69e43c00894fe54587c2a0c7ec798fc1204563c3bab621ac268bb1fe92f52140 WatchSource:0}: Error finding container 69e43c00894fe54587c2a0c7ec798fc1204563c3bab621ac268bb1fe92f52140: Status 404 returned error can't find the container with id 69e43c00894fe54587c2a0c7ec798fc1204563c3bab621ac268bb1fe92f52140
Sep 30 15:01:57 crc kubenswrapper[4763]: I0930 15:01:57.489007 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3"
Sep 30 15:01:57 crc kubenswrapper[4763]: E0930 15:01:57.489240 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9"
Sep 30 15:01:57 crc kubenswrapper[4763]: I0930 15:01:57.492435 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"e7101812-e158-46b0-af76-063835526dc4","Type":"ContainerStarted","Data":"69e43c00894fe54587c2a0c7ec798fc1204563c3bab621ac268bb1fe92f52140"}
Sep 30 15:01:58 crc kubenswrapper[4763]: I0930 15:01:58.512595 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aaa9b0a-b78c-4512-a481-a9859611712e" path="/var/lib/kubelet/pods/2aaa9b0a-b78c-4512-a481-a9859611712e/volumes"
Sep 30 15:01:58 crc kubenswrapper[4763]: I0930 15:01:58.514020 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"e7101812-e158-46b0-af76-063835526dc4","Type":"ContainerStarted","Data":"6692f75b9a1038d4ced5252d13629cce416fcac4908529a3f9f23691b3f22300"}
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.505300 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=7.830987701 podStartE2EDuration="8.505277247s" podCreationTimestamp="2025-09-30 15:01:55 +0000 UTC" firstStartedPulling="2025-09-30 15:01:57.170410029 +0000 UTC m=+5189.308970314" lastFinishedPulling="2025-09-30 15:01:57.844699575 +0000 UTC m=+5189.983259860" observedRunningTime="2025-09-30 15:01:58.540868285 +0000 UTC m=+5190.679428590" watchObservedRunningTime="2025-09-30 15:02:03.505277247 +0000 UTC m=+5195.643837532"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.507457 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.508863 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.510850 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.511018 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-szppw"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.511126 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.564330 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.582252 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzh99\" (UniqueName: \"kubernetes.io/projected/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-kube-api-access-kzh99\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.582902 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.583016 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.583072 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-scripts\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.583087 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-config\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.684851 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.684981 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.685026 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-scripts\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.685062 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-config\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.685136 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzh99\" (UniqueName: \"kubernetes.io/projected/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-kube-api-access-kzh99\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.685521 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.686126 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-config\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.686127 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-scripts\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.693525 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.704418 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzh99\" (UniqueName: \"kubernetes.io/projected/4f76f17e-9e81-4e2a-b3a1-428e4e54972d-kube-api-access-kzh99\") pod \"ovn-northd-0\" (UID: \"4f76f17e-9e81-4e2a-b3a1-428e4e54972d\") " pod="openstack/ovn-northd-0"
Sep 30 15:02:03 crc kubenswrapper[4763]: I0930 15:02:03.835742 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.221870 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth" Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.287831 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-xfwmm"] Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.289732 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" podUID="490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" containerName="dnsmasq-dns" containerID="cri-o://5f4145f40a965940038635da21258c9dc3fd4a1feb1bbc2d51d606ee776c5df4" gracePeriod=10 Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.315664 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.555804 4763 generic.go:334] "Generic (PLEG): container finished" podID="490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" containerID="5f4145f40a965940038635da21258c9dc3fd4a1feb1bbc2d51d606ee776c5df4" exitCode=0 Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.555939 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" event={"ID":"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420","Type":"ContainerDied","Data":"5f4145f40a965940038635da21258c9dc3fd4a1feb1bbc2d51d606ee776c5df4"} Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.557977 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f76f17e-9e81-4e2a-b3a1-428e4e54972d","Type":"ContainerStarted","Data":"49ec884d56259fa63cc120351b80704dc184870f144e7cf536862c2bb08844c6"} Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.807652 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.906332 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-config\") pod \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.906562 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-dns-svc\") pod \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.906618 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qv6z\" (UniqueName: \"kubernetes.io/projected/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-kube-api-access-9qv6z\") pod \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\" (UID: \"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420\") " Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.911440 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-kube-api-access-9qv6z" (OuterVolumeSpecName: "kube-api-access-9qv6z") pod "490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" (UID: "490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420"). InnerVolumeSpecName "kube-api-access-9qv6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.949240 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-config" (OuterVolumeSpecName: "config") pod "490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" (UID: "490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 15:02:04 crc kubenswrapper[4763]: I0930 15:02:04.950387 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" (UID: "490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.008796 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.008842 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qv6z\" (UniqueName: \"kubernetes.io/projected/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-kube-api-access-9qv6z\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.008859 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420-config\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.566367 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f76f17e-9e81-4e2a-b3a1-428e4e54972d","Type":"ContainerStarted","Data":"5844ed86a94a950c1c4f1d94980466bee0f352eca56a02b404e4d8f6da8df0b3"} Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.566660 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.566671 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f76f17e-9e81-4e2a-b3a1-428e4e54972d","Type":"ContainerStarted","Data":"905a51d516d7f532c884c64796e0d74afb0f94084e405265db34856c44c85f60"} Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.568856 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" event={"ID":"490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420","Type":"ContainerDied","Data":"f069333388a71e84f21d931a2321e40e13b61f97b6ba0ad44f465a55506708af"} Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.568890 4763 scope.go:117] "RemoveContainer" containerID="5f4145f40a965940038635da21258c9dc3fd4a1feb1bbc2d51d606ee776c5df4" Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.568968 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6885566dd9-xfwmm" Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.588853 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.588828296 podStartE2EDuration="2.588828296s" podCreationTimestamp="2025-09-30 15:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:02:05.584455617 +0000 UTC m=+5197.723015922" watchObservedRunningTime="2025-09-30 15:02:05.588828296 +0000 UTC m=+5197.727388601" Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.601194 4763 scope.go:117] "RemoveContainer" containerID="82949582a99bf7415dc35246acd3ddcde8be83f79dd91801eb7deb008a73d429" Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.616164 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-xfwmm"] Sep 30 15:02:05 crc kubenswrapper[4763]: I0930 15:02:05.621386 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-xfwmm"] Sep 30 15:02:06 crc kubenswrapper[4763]: I0930 15:02:06.499348 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" path="/var/lib/kubelet/pods/490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420/volumes" Sep 30 15:02:08 crc kubenswrapper[4763]: I0930 15:02:08.494900 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:02:08 crc kubenswrapper[4763]: E0930 15:02:08.495438 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:02:08 crc kubenswrapper[4763]: I0930 15:02:08.629475 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-hlsjx"] Sep 30 15:02:08 crc kubenswrapper[4763]: E0930 15:02:08.629920 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" containerName="dnsmasq-dns" Sep 30 15:02:08 crc kubenswrapper[4763]: I0930 15:02:08.629945 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" containerName="dnsmasq-dns" Sep 30 15:02:08 crc kubenswrapper[4763]: E0930 15:02:08.629986 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" containerName="init" Sep 30 15:02:08 crc kubenswrapper[4763]: I0930 15:02:08.629994 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" containerName="init" Sep 30 15:02:08 crc kubenswrapper[4763]: I0930 15:02:08.630182 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="490fc3cc-71b7-4a1b-9cc0-a98cbfb2e420" containerName="dnsmasq-dns" Sep 30 15:02:08 crc kubenswrapper[4763]: I0930 15:02:08.630907 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hlsjx" Sep 30 15:02:08 crc kubenswrapper[4763]: I0930 15:02:08.650271 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hlsjx"] Sep 30 15:02:08 crc kubenswrapper[4763]: I0930 15:02:08.774099 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfh5r\" (UniqueName: \"kubernetes.io/projected/be97e602-d2d5-4dde-b3fc-3f013fabf8fc-kube-api-access-lfh5r\") pod \"keystone-db-create-hlsjx\" (UID: \"be97e602-d2d5-4dde-b3fc-3f013fabf8fc\") " pod="openstack/keystone-db-create-hlsjx" Sep 30 15:02:08 crc kubenswrapper[4763]: I0930 15:02:08.875814 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfh5r\" (UniqueName: \"kubernetes.io/projected/be97e602-d2d5-4dde-b3fc-3f013fabf8fc-kube-api-access-lfh5r\") pod \"keystone-db-create-hlsjx\" (UID: \"be97e602-d2d5-4dde-b3fc-3f013fabf8fc\") " pod="openstack/keystone-db-create-hlsjx" Sep 30 15:02:08 crc kubenswrapper[4763]: I0930 15:02:08.896360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfh5r\" (UniqueName: \"kubernetes.io/projected/be97e602-d2d5-4dde-b3fc-3f013fabf8fc-kube-api-access-lfh5r\") pod \"keystone-db-create-hlsjx\" (UID: \"be97e602-d2d5-4dde-b3fc-3f013fabf8fc\") " pod="openstack/keystone-db-create-hlsjx" Sep 30 15:02:08 crc kubenswrapper[4763]: I0930 15:02:08.954480 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hlsjx" Sep 30 15:02:09 crc kubenswrapper[4763]: I0930 15:02:09.356537 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hlsjx"] Sep 30 15:02:09 crc kubenswrapper[4763]: W0930 15:02:09.362139 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe97e602_d2d5_4dde_b3fc_3f013fabf8fc.slice/crio-1776d05acaf214d0d48259e6fba3c22268d8f1f8034cbae28d91ec72a13ce83b WatchSource:0}: Error finding container 1776d05acaf214d0d48259e6fba3c22268d8f1f8034cbae28d91ec72a13ce83b: Status 404 returned error can't find the container with id 1776d05acaf214d0d48259e6fba3c22268d8f1f8034cbae28d91ec72a13ce83b Sep 30 15:02:09 crc kubenswrapper[4763]: I0930 15:02:09.602359 4763 generic.go:334] "Generic (PLEG): container finished" podID="be97e602-d2d5-4dde-b3fc-3f013fabf8fc" containerID="85ea2a498dd708c3391dcb82b606a930090b4ad77ed41e25c536d13f0d2637b1" exitCode=0 Sep 30 15:02:09 crc kubenswrapper[4763]: I0930 15:02:09.602408 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hlsjx" event={"ID":"be97e602-d2d5-4dde-b3fc-3f013fabf8fc","Type":"ContainerDied","Data":"85ea2a498dd708c3391dcb82b606a930090b4ad77ed41e25c536d13f0d2637b1"} Sep 30 15:02:09 crc kubenswrapper[4763]: I0930 15:02:09.602723 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hlsjx" event={"ID":"be97e602-d2d5-4dde-b3fc-3f013fabf8fc","Type":"ContainerStarted","Data":"1776d05acaf214d0d48259e6fba3c22268d8f1f8034cbae28d91ec72a13ce83b"} Sep 30 15:02:10 crc kubenswrapper[4763]: I0930 15:02:10.916925 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hlsjx" Sep 30 15:02:11 crc kubenswrapper[4763]: I0930 15:02:11.011901 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfh5r\" (UniqueName: \"kubernetes.io/projected/be97e602-d2d5-4dde-b3fc-3f013fabf8fc-kube-api-access-lfh5r\") pod \"be97e602-d2d5-4dde-b3fc-3f013fabf8fc\" (UID: \"be97e602-d2d5-4dde-b3fc-3f013fabf8fc\") " Sep 30 15:02:11 crc kubenswrapper[4763]: I0930 15:02:11.017496 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be97e602-d2d5-4dde-b3fc-3f013fabf8fc-kube-api-access-lfh5r" (OuterVolumeSpecName: "kube-api-access-lfh5r") pod "be97e602-d2d5-4dde-b3fc-3f013fabf8fc" (UID: "be97e602-d2d5-4dde-b3fc-3f013fabf8fc"). InnerVolumeSpecName "kube-api-access-lfh5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:11 crc kubenswrapper[4763]: I0930 15:02:11.114096 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfh5r\" (UniqueName: \"kubernetes.io/projected/be97e602-d2d5-4dde-b3fc-3f013fabf8fc-kube-api-access-lfh5r\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:11 crc kubenswrapper[4763]: I0930 15:02:11.621051 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hlsjx" event={"ID":"be97e602-d2d5-4dde-b3fc-3f013fabf8fc","Type":"ContainerDied","Data":"1776d05acaf214d0d48259e6fba3c22268d8f1f8034cbae28d91ec72a13ce83b"} Sep 30 15:02:11 crc kubenswrapper[4763]: I0930 15:02:11.621370 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1776d05acaf214d0d48259e6fba3c22268d8f1f8034cbae28d91ec72a13ce83b" Sep 30 15:02:11 crc kubenswrapper[4763]: I0930 15:02:11.621428 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hlsjx" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.470517 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5rn7r"] Sep 30 15:02:13 crc kubenswrapper[4763]: E0930 15:02:13.470934 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be97e602-d2d5-4dde-b3fc-3f013fabf8fc" containerName="mariadb-database-create" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.470949 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="be97e602-d2d5-4dde-b3fc-3f013fabf8fc" containerName="mariadb-database-create" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.471158 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="be97e602-d2d5-4dde-b3fc-3f013fabf8fc" containerName="mariadb-database-create" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.472438 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.511623 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rn7r"] Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.550098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-utilities\") pod \"redhat-operators-5rn7r\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.550347 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tvhm\" (UniqueName: \"kubernetes.io/projected/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-kube-api-access-6tvhm\") pod \"redhat-operators-5rn7r\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.550743 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-catalog-content\") pod \"redhat-operators-5rn7r\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.651827 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-catalog-content\") pod \"redhat-operators-5rn7r\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.652166 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-utilities\") pod \"redhat-operators-5rn7r\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.652268 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvhm\" (UniqueName: \"kubernetes.io/projected/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-kube-api-access-6tvhm\") pod \"redhat-operators-5rn7r\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.652704 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-catalog-content\") pod \"redhat-operators-5rn7r\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.652792 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-utilities\") pod \"redhat-operators-5rn7r\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.682542 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6tvhm\" (UniqueName: \"kubernetes.io/projected/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-kube-api-access-6tvhm\") pod \"redhat-operators-5rn7r\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:13 crc kubenswrapper[4763]: I0930 15:02:13.808177 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:14 crc kubenswrapper[4763]: W0930 15:02:14.094578 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c87628e_f1c3_43cc_80c3_b6b17f362a8e.slice/crio-f5af9f94874f77b7c4a9d75f76b800c0e488ca1cb08e0292e06ab18121615fbd WatchSource:0}: Error finding container f5af9f94874f77b7c4a9d75f76b800c0e488ca1cb08e0292e06ab18121615fbd: Status 404 returned error can't find the container with id f5af9f94874f77b7c4a9d75f76b800c0e488ca1cb08e0292e06ab18121615fbd Sep 30 15:02:14 crc kubenswrapper[4763]: I0930 15:02:14.094828 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rn7r"] Sep 30 15:02:14 crc kubenswrapper[4763]: I0930 15:02:14.646092 4763 generic.go:334] "Generic (PLEG): container finished" podID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" containerID="a77fa054d4adebb383a323632d86a0560d0885bfc46f7cf1eb2ddff0cea1735b" exitCode=0 Sep 30 15:02:14 crc kubenswrapper[4763]: I0930 15:02:14.646142 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rn7r" event={"ID":"2c87628e-f1c3-43cc-80c3-b6b17f362a8e","Type":"ContainerDied","Data":"a77fa054d4adebb383a323632d86a0560d0885bfc46f7cf1eb2ddff0cea1735b"} Sep 30 15:02:14 crc kubenswrapper[4763]: I0930 15:02:14.646172 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rn7r" event={"ID":"2c87628e-f1c3-43cc-80c3-b6b17f362a8e","Type":"ContainerStarted","Data":"f5af9f94874f77b7c4a9d75f76b800c0e488ca1cb08e0292e06ab18121615fbd"} Sep 30 15:02:17 crc kubenswrapper[4763]: I0930 15:02:17.669714 4763 generic.go:334] "Generic (PLEG): container finished" podID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" containerID="82b3c4d5c2ca20f03e3993828f390e8e037ed8dc02f20b5ad663517c66251193" exitCode=0 Sep 30 15:02:17 crc kubenswrapper[4763]: I0930 15:02:17.669787 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rn7r" event={"ID":"2c87628e-f1c3-43cc-80c3-b6b17f362a8e","Type":"ContainerDied","Data":"82b3c4d5c2ca20f03e3993828f390e8e037ed8dc02f20b5ad663517c66251193"} Sep 30 15:02:18 crc kubenswrapper[4763]: I0930 15:02:18.680128 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rn7r" event={"ID":"2c87628e-f1c3-43cc-80c3-b6b17f362a8e","Type":"ContainerStarted","Data":"e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9"} Sep 30 15:02:18 crc kubenswrapper[4763]: I0930 15:02:18.697687 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5rn7r" podStartSLOduration=2.173821143 podStartE2EDuration="5.69767009s" podCreationTimestamp="2025-09-30 15:02:13 +0000 UTC" firstStartedPulling="2025-09-30 15:02:14.647895149 +0000 UTC m=+5206.786455434" lastFinishedPulling="2025-09-30 15:02:18.171744096 +0000 UTC m=+5210.310304381" observedRunningTime="2025-09-30 15:02:18.694503342 +0000 UTC m=+5210.833063647" watchObservedRunningTime="2025-09-30 15:02:18.69767009 +0000 
UTC m=+5210.836230375" Sep 30 15:02:18 crc kubenswrapper[4763]: I0930 15:02:18.729711 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-33f9-account-create-x9dxr"] Sep 30 15:02:18 crc kubenswrapper[4763]: I0930 15:02:18.730790 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-33f9-account-create-x9dxr" Sep 30 15:02:18 crc kubenswrapper[4763]: I0930 15:02:18.732739 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 15:02:18 crc kubenswrapper[4763]: I0930 15:02:18.740087 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-33f9-account-create-x9dxr"] Sep 30 15:02:18 crc kubenswrapper[4763]: I0930 15:02:18.842052 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bx9w\" (UniqueName: \"kubernetes.io/projected/e6579aae-8365-4154-bc8b-c5af7b342ebb-kube-api-access-8bx9w\") pod \"keystone-33f9-account-create-x9dxr\" (UID: \"e6579aae-8365-4154-bc8b-c5af7b342ebb\") " pod="openstack/keystone-33f9-account-create-x9dxr" Sep 30 15:02:18 crc kubenswrapper[4763]: I0930 15:02:18.890166 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 15:02:18 crc kubenswrapper[4763]: I0930 15:02:18.943898 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bx9w\" (UniqueName: \"kubernetes.io/projected/e6579aae-8365-4154-bc8b-c5af7b342ebb-kube-api-access-8bx9w\") pod \"keystone-33f9-account-create-x9dxr\" (UID: \"e6579aae-8365-4154-bc8b-c5af7b342ebb\") " pod="openstack/keystone-33f9-account-create-x9dxr" Sep 30 15:02:18 crc kubenswrapper[4763]: I0930 15:02:18.974212 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bx9w\" (UniqueName: \"kubernetes.io/projected/e6579aae-8365-4154-bc8b-c5af7b342ebb-kube-api-access-8bx9w\") pod \"keystone-33f9-account-create-x9dxr\" (UID: \"e6579aae-8365-4154-bc8b-c5af7b342ebb\") " pod="openstack/keystone-33f9-account-create-x9dxr" Sep 30 15:02:19 crc kubenswrapper[4763]: I0930 15:02:19.047427 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-33f9-account-create-x9dxr" Sep 30 15:02:19 crc kubenswrapper[4763]: I0930 15:02:19.482382 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-33f9-account-create-x9dxr"] Sep 30 15:02:19 crc kubenswrapper[4763]: W0930 15:02:19.486628 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6579aae_8365_4154_bc8b_c5af7b342ebb.slice/crio-559786bd6e90312f82e59491830df8d2c3dafa01188fb1a9abf3c04ccbdb914f WatchSource:0}: Error finding container 559786bd6e90312f82e59491830df8d2c3dafa01188fb1a9abf3c04ccbdb914f: Status 404 returned error can't find the container with id 559786bd6e90312f82e59491830df8d2c3dafa01188fb1a9abf3c04ccbdb914f Sep 30 15:02:19 crc kubenswrapper[4763]: I0930 15:02:19.693043 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-33f9-account-create-x9dxr" event={"ID":"e6579aae-8365-4154-bc8b-c5af7b342ebb","Type":"ContainerStarted","Data":"13781fa7eecf5e7586de82f2051837927da1994ea3e89978d33596bd9b26e154"} Sep 30 15:02:19 crc kubenswrapper[4763]: I0930 15:02:19.693294 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-33f9-account-create-x9dxr" event={"ID":"e6579aae-8365-4154-bc8b-c5af7b342ebb","Type":"ContainerStarted","Data":"559786bd6e90312f82e59491830df8d2c3dafa01188fb1a9abf3c04ccbdb914f"} Sep 30 15:02:19 crc kubenswrapper[4763]: I0930 15:02:19.715518 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-33f9-account-create-x9dxr" podStartSLOduration=1.7154961530000001 podStartE2EDuration="1.715496153s" podCreationTimestamp="2025-09-30 15:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:02:19.711884883 +0000 UTC m=+5211.850445188" watchObservedRunningTime="2025-09-30 15:02:19.715496153 +0000 UTC m=+5211.854056438" Sep 30 15:02:20 crc kubenswrapper[4763]: I0930 15:02:20.700297 4763 generic.go:334] "Generic (PLEG): container finished" podID="e6579aae-8365-4154-bc8b-c5af7b342ebb" containerID="13781fa7eecf5e7586de82f2051837927da1994ea3e89978d33596bd9b26e154" exitCode=0 Sep 30 15:02:20 crc kubenswrapper[4763]: I0930 15:02:20.700339 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-33f9-account-create-x9dxr" event={"ID":"e6579aae-8365-4154-bc8b-c5af7b342ebb","Type":"ContainerDied","Data":"13781fa7eecf5e7586de82f2051837927da1994ea3e89978d33596bd9b26e154"} Sep 30 15:02:22 crc kubenswrapper[4763]: I0930 15:02:22.023652 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-33f9-account-create-x9dxr" Sep 30 15:02:22 crc kubenswrapper[4763]: I0930 15:02:22.095838 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bx9w\" (UniqueName: \"kubernetes.io/projected/e6579aae-8365-4154-bc8b-c5af7b342ebb-kube-api-access-8bx9w\") pod \"e6579aae-8365-4154-bc8b-c5af7b342ebb\" (UID: \"e6579aae-8365-4154-bc8b-c5af7b342ebb\") " Sep 30 15:02:22 crc kubenswrapper[4763]: I0930 15:02:22.101229 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6579aae-8365-4154-bc8b-c5af7b342ebb-kube-api-access-8bx9w" (OuterVolumeSpecName: "kube-api-access-8bx9w") pod "e6579aae-8365-4154-bc8b-c5af7b342ebb" (UID: "e6579aae-8365-4154-bc8b-c5af7b342ebb"). 
InnerVolumeSpecName "kube-api-access-8bx9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:22 crc kubenswrapper[4763]: I0930 15:02:22.198029 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bx9w\" (UniqueName: \"kubernetes.io/projected/e6579aae-8365-4154-bc8b-c5af7b342ebb-kube-api-access-8bx9w\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:22 crc kubenswrapper[4763]: I0930 15:02:22.489464 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:02:22 crc kubenswrapper[4763]: E0930 15:02:22.490061 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:02:22 crc kubenswrapper[4763]: I0930 15:02:22.716150 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-33f9-account-create-x9dxr" event={"ID":"e6579aae-8365-4154-bc8b-c5af7b342ebb","Type":"ContainerDied","Data":"559786bd6e90312f82e59491830df8d2c3dafa01188fb1a9abf3c04ccbdb914f"} Sep 30 15:02:22 crc kubenswrapper[4763]: I0930 15:02:22.716189 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-33f9-account-create-x9dxr" Sep 30 15:02:22 crc kubenswrapper[4763]: I0930 15:02:22.716197 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="559786bd6e90312f82e59491830df8d2c3dafa01188fb1a9abf3c04ccbdb914f" Sep 30 15:02:23 crc kubenswrapper[4763]: I0930 15:02:23.808767 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:23 crc kubenswrapper[4763]: I0930 15:02:23.808814 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:23 crc kubenswrapper[4763]: I0930 15:02:23.861325 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.185055 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wqkxw"] Sep 30 15:02:24 crc kubenswrapper[4763]: E0930 15:02:24.185439 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6579aae-8365-4154-bc8b-c5af7b342ebb" containerName="mariadb-account-create" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.185465 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6579aae-8365-4154-bc8b-c5af7b342ebb" containerName="mariadb-account-create" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.185799 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6579aae-8365-4154-bc8b-c5af7b342ebb" containerName="mariadb-account-create" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.186492 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.188924 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rkqp7" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.189222 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.189373 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.190865 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.194378 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wqkxw"] Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.231557 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-config-data\") pod \"keystone-db-sync-wqkxw\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.231682 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mbb\" (UniqueName: \"kubernetes.io/projected/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-kube-api-access-55mbb\") pod \"keystone-db-sync-wqkxw\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.231767 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-combined-ca-bundle\") pod \"keystone-db-sync-wqkxw\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.333089 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-combined-ca-bundle\") pod \"keystone-db-sync-wqkxw\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.333170 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-config-data\") pod \"keystone-db-sync-wqkxw\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.333229 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55mbb\" (UniqueName: \"kubernetes.io/projected/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-kube-api-access-55mbb\") pod \"keystone-db-sync-wqkxw\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.340484 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-config-data\") pod \"keystone-db-sync-wqkxw\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " 
pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.346411 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-combined-ca-bundle\") pod \"keystone-db-sync-wqkxw\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.350548 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mbb\" (UniqueName: \"kubernetes.io/projected/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-kube-api-access-55mbb\") pod \"keystone-db-sync-wqkxw\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.513433 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.779903 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:24 crc kubenswrapper[4763]: I0930 15:02:24.828818 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5rn7r"] Sep 30 15:02:25 crc kubenswrapper[4763]: I0930 15:02:25.055256 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wqkxw"] Sep 30 15:02:25 crc kubenswrapper[4763]: W0930 15:02:25.055843 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae0d63a0_c59a_4eb2_ae24_b65200d012b8.slice/crio-2ca495006e2005103e93354a23124b9510918ad741bfef3d4c1d76a2f9ba9a4c WatchSource:0}: Error finding container 2ca495006e2005103e93354a23124b9510918ad741bfef3d4c1d76a2f9ba9a4c: Status 404 returned error can't find the container with id 2ca495006e2005103e93354a23124b9510918ad741bfef3d4c1d76a2f9ba9a4c Sep 30 15:02:25 crc kubenswrapper[4763]: I0930 15:02:25.741160 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wqkxw" event={"ID":"ae0d63a0-c59a-4eb2-ae24-b65200d012b8","Type":"ContainerStarted","Data":"2037c6e333a957fe66075dc6c0a96844cde6ca1b7eafd9b094ed58e953b44cbb"} Sep 30 15:02:25 crc kubenswrapper[4763]: I0930 15:02:25.741488 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wqkxw" event={"ID":"ae0d63a0-c59a-4eb2-ae24-b65200d012b8","Type":"ContainerStarted","Data":"2ca495006e2005103e93354a23124b9510918ad741bfef3d4c1d76a2f9ba9a4c"} Sep 30 15:02:25 crc kubenswrapper[4763]: I0930 15:02:25.764685 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wqkxw" podStartSLOduration=1.764663557 podStartE2EDuration="1.764663557s" podCreationTimestamp="2025-09-30 15:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:02:25.759101568 +0000 UTC m=+5217.897661853" watchObservedRunningTime="2025-09-30 15:02:25.764663557 +0000 UTC m=+5217.903223842" Sep 30 15:02:26 crc kubenswrapper[4763]: I0930 15:02:26.748044 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5rn7r" podUID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" containerName="registry-server" 
containerID="cri-o://e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9" gracePeriod=2 Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.159134 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.286392 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-utilities\") pod \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.286558 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tvhm\" (UniqueName: \"kubernetes.io/projected/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-kube-api-access-6tvhm\") pod \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.286715 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-catalog-content\") pod \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\" (UID: \"2c87628e-f1c3-43cc-80c3-b6b17f362a8e\") " Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.288236 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-utilities" (OuterVolumeSpecName: "utilities") pod "2c87628e-f1c3-43cc-80c3-b6b17f362a8e" (UID: "2c87628e-f1c3-43cc-80c3-b6b17f362a8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.298154 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-kube-api-access-6tvhm" (OuterVolumeSpecName: "kube-api-access-6tvhm") pod "2c87628e-f1c3-43cc-80c3-b6b17f362a8e" (UID: "2c87628e-f1c3-43cc-80c3-b6b17f362a8e"). InnerVolumeSpecName "kube-api-access-6tvhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.388449 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tvhm\" (UniqueName: \"kubernetes.io/projected/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-kube-api-access-6tvhm\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.388481 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.756470 4763 generic.go:334] "Generic (PLEG): container finished" podID="ae0d63a0-c59a-4eb2-ae24-b65200d012b8" containerID="2037c6e333a957fe66075dc6c0a96844cde6ca1b7eafd9b094ed58e953b44cbb" exitCode=0 Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.756559 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wqkxw" event={"ID":"ae0d63a0-c59a-4eb2-ae24-b65200d012b8","Type":"ContainerDied","Data":"2037c6e333a957fe66075dc6c0a96844cde6ca1b7eafd9b094ed58e953b44cbb"} Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.758906 4763 generic.go:334] "Generic (PLEG): container finished" podID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" containerID="e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9" exitCode=0 Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.758948 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rn7r" event={"ID":"2c87628e-f1c3-43cc-80c3-b6b17f362a8e","Type":"ContainerDied","Data":"e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9"} Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.758960 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5rn7r" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.758980 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rn7r" event={"ID":"2c87628e-f1c3-43cc-80c3-b6b17f362a8e","Type":"ContainerDied","Data":"f5af9f94874f77b7c4a9d75f76b800c0e488ca1cb08e0292e06ab18121615fbd"} Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.759002 4763 scope.go:117] "RemoveContainer" containerID="e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.776976 4763 scope.go:117] "RemoveContainer" containerID="82b3c4d5c2ca20f03e3993828f390e8e037ed8dc02f20b5ad663517c66251193" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.802587 4763 scope.go:117] "RemoveContainer" containerID="a77fa054d4adebb383a323632d86a0560d0885bfc46f7cf1eb2ddff0cea1735b" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.831771 4763 scope.go:117] "RemoveContainer" containerID="e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9" Sep 30 15:02:27 crc kubenswrapper[4763]: E0930 15:02:27.832101 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9\": container with ID starting with e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9 not found: ID does not exist" containerID="e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.832138 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9"} err="failed to get container status \"e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9\": rpc error: code = NotFound desc = could not find container \"e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9\": container with ID starting with e0bc157c2246d2782d60519fba4fa538bd60d8dd3d0865d34f20baf76df12db9 not found: ID does not exist" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.832161 4763 scope.go:117] "RemoveContainer" containerID="82b3c4d5c2ca20f03e3993828f390e8e037ed8dc02f20b5ad663517c66251193" Sep 30 15:02:27 crc kubenswrapper[4763]: E0930 15:02:27.832379 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82b3c4d5c2ca20f03e3993828f390e8e037ed8dc02f20b5ad663517c66251193\": container with ID starting with 82b3c4d5c2ca20f03e3993828f390e8e037ed8dc02f20b5ad663517c66251193 not found: ID does not exist" containerID="82b3c4d5c2ca20f03e3993828f390e8e037ed8dc02f20b5ad663517c66251193" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.832399 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82b3c4d5c2ca20f03e3993828f390e8e037ed8dc02f20b5ad663517c66251193"} err="failed to get container status \"82b3c4d5c2ca20f03e3993828f390e8e037ed8dc02f20b5ad663517c66251193\": rpc error: code = NotFound desc = could not find container \"82b3c4d5c2ca20f03e3993828f390e8e037ed8dc02f20b5ad663517c66251193\": container with ID starting with 82b3c4d5c2ca20f03e3993828f390e8e037ed8dc02f20b5ad663517c66251193 not found: ID does not exist" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.832411 4763 scope.go:117] "RemoveContainer" 
containerID="a77fa054d4adebb383a323632d86a0560d0885bfc46f7cf1eb2ddff0cea1735b" Sep 30 15:02:27 crc kubenswrapper[4763]: E0930 15:02:27.832581 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a77fa054d4adebb383a323632d86a0560d0885bfc46f7cf1eb2ddff0cea1735b\": container with ID starting with a77fa054d4adebb383a323632d86a0560d0885bfc46f7cf1eb2ddff0cea1735b not found: ID does not exist" containerID="a77fa054d4adebb383a323632d86a0560d0885bfc46f7cf1eb2ddff0cea1735b" Sep 30 15:02:27 crc kubenswrapper[4763]: I0930 15:02:27.832628 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77fa054d4adebb383a323632d86a0560d0885bfc46f7cf1eb2ddff0cea1735b"} err="failed to get container status \"a77fa054d4adebb383a323632d86a0560d0885bfc46f7cf1eb2ddff0cea1735b\": rpc error: code = NotFound desc = could not find container \"a77fa054d4adebb383a323632d86a0560d0885bfc46f7cf1eb2ddff0cea1735b\": container with ID starting with a77fa054d4adebb383a323632d86a0560d0885bfc46f7cf1eb2ddff0cea1735b not found: ID does not exist" Sep 30 15:02:28 crc kubenswrapper[4763]: I0930 15:02:28.550155 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c87628e-f1c3-43cc-80c3-b6b17f362a8e" (UID: "2c87628e-f1c3-43cc-80c3-b6b17f362a8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:02:28 crc kubenswrapper[4763]: I0930 15:02:28.614781 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c87628e-f1c3-43cc-80c3-b6b17f362a8e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:28 crc kubenswrapper[4763]: I0930 15:02:28.696450 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5rn7r"] Sep 30 15:02:28 crc kubenswrapper[4763]: I0930 15:02:28.703967 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5rn7r"] Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.094843 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.123938 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55mbb\" (UniqueName: \"kubernetes.io/projected/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-kube-api-access-55mbb\") pod \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.124016 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-combined-ca-bundle\") pod \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.124129 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-config-data\") pod \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\" (UID: \"ae0d63a0-c59a-4eb2-ae24-b65200d012b8\") " Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.139103 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-kube-api-access-55mbb" (OuterVolumeSpecName: "kube-api-access-55mbb") pod "ae0d63a0-c59a-4eb2-ae24-b65200d012b8" (UID: "ae0d63a0-c59a-4eb2-ae24-b65200d012b8"). InnerVolumeSpecName "kube-api-access-55mbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.171807 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae0d63a0-c59a-4eb2-ae24-b65200d012b8" (UID: "ae0d63a0-c59a-4eb2-ae24-b65200d012b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.185083 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-config-data" (OuterVolumeSpecName: "config-data") pod "ae0d63a0-c59a-4eb2-ae24-b65200d012b8" (UID: "ae0d63a0-c59a-4eb2-ae24-b65200d012b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.225647 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.225677 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.225689 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55mbb\" (UniqueName: \"kubernetes.io/projected/ae0d63a0-c59a-4eb2-ae24-b65200d012b8-kube-api-access-55mbb\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.776035 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wqkxw" event={"ID":"ae0d63a0-c59a-4eb2-ae24-b65200d012b8","Type":"ContainerDied","Data":"2ca495006e2005103e93354a23124b9510918ad741bfef3d4c1d76a2f9ba9a4c"} Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.776353 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ca495006e2005103e93354a23124b9510918ad741bfef3d4c1d76a2f9ba9a4c" Sep 30 15:02:29 crc kubenswrapper[4763]: I0930 15:02:29.776119 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wqkxw" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.022395 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dcb497f8f-c9md6"] Sep 30 15:02:30 crc kubenswrapper[4763]: E0930 15:02:30.023092 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" containerName="extract-content" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.023119 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" containerName="extract-content" Sep 30 15:02:30 crc kubenswrapper[4763]: E0930 15:02:30.023138 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0d63a0-c59a-4eb2-ae24-b65200d012b8" containerName="keystone-db-sync" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.023145 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0d63a0-c59a-4eb2-ae24-b65200d012b8" containerName="keystone-db-sync" Sep 30 15:02:30 crc kubenswrapper[4763]: E0930 15:02:30.023171 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" containerName="extract-utilities" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.023178 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" containerName="extract-utilities" Sep 30 15:02:30 crc kubenswrapper[4763]: E0930 15:02:30.023191 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" containerName="registry-server" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.023197 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" containerName="registry-server" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.023333 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae0d63a0-c59a-4eb2-ae24-b65200d012b8" containerName="keystone-db-sync" Sep 30 15:02:30 crc 
kubenswrapper[4763]: I0930 15:02:30.023354 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" containerName="registry-server" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.024168 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.037404 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dcb497f8f-c9md6"] Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.068342 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-25xrb"] Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.071767 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.078862 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rkqp7" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.081092 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.081294 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.081415 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.127232 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-25xrb"] Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.138865 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj7vl\" (UniqueName: \"kubernetes.io/projected/e339af44-3091-43cf-97d5-f0ea9f55a33d-kube-api-access-nj7vl\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.138910 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-scripts\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.138941 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h77x5\" (UniqueName: \"kubernetes.io/projected/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-kube-api-access-h77x5\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.138969 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-config-data\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.138989 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.139020 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-config\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.139044 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-combined-ca-bundle\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.139095 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.139153 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-credential-keys\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.139174 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-dns-svc\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.139188 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-fernet-keys\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.240725 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj7vl\" (UniqueName: \"kubernetes.io/projected/e339af44-3091-43cf-97d5-f0ea9f55a33d-kube-api-access-nj7vl\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.240783 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-scripts\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.240811 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h77x5\" (UniqueName: 
\"kubernetes.io/projected/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-kube-api-access-h77x5\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.240830 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-config-data\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.240846 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.240879 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-config\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.240941 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-combined-ca-bundle\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.240973 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.241028 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-credential-keys\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.241056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-dns-svc\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.241078 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-fernet-keys\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.241939 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: 
\"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.241985 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.242403 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-config\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.242570 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e339af44-3091-43cf-97d5-f0ea9f55a33d-dns-svc\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.244500 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-scripts\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.244798 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-fernet-keys\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.244940 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-credential-keys\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.257964 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h77x5\" (UniqueName: \"kubernetes.io/projected/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-kube-api-access-h77x5\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.258320 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-config-data\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.258498 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj7vl\" (UniqueName: \"kubernetes.io/projected/e339af44-3091-43cf-97d5-f0ea9f55a33d-kube-api-access-nj7vl\") pod \"dnsmasq-dns-5dcb497f8f-c9md6\" (UID: \"e339af44-3091-43cf-97d5-f0ea9f55a33d\") " pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.259163 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-combined-ca-bundle\") pod \"keystone-bootstrap-25xrb\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.345390 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.401824 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.501373 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c87628e-f1c3-43cc-80c3-b6b17f362a8e" path="/var/lib/kubelet/pods/2c87628e-f1c3-43cc-80c3-b6b17f362a8e/volumes" Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.775015 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dcb497f8f-c9md6"] Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.786143 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" event={"ID":"e339af44-3091-43cf-97d5-f0ea9f55a33d","Type":"ContainerStarted","Data":"8d8293c1a7e5686b569945a3932514d0c0d56fe36a0c88d7a2b8e72a97419378"} Sep 30 15:02:30 crc kubenswrapper[4763]: I0930 15:02:30.885289 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-25xrb"] Sep 30 15:02:30 crc kubenswrapper[4763]: W0930 15:02:30.892213 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71bfb59e_8908_4b06_a1f9_e984d6a0b76f.slice/crio-0afcc72ccf868d8ed622334ed9fe68175ec45869c70885c7b062d6d450ac5749 WatchSource:0}: Error finding container 0afcc72ccf868d8ed622334ed9fe68175ec45869c70885c7b062d6d450ac5749: Status 404 returned error can't find the container with id 0afcc72ccf868d8ed622334ed9fe68175ec45869c70885c7b062d6d450ac5749 Sep 30 15:02:31 crc kubenswrapper[4763]: I0930 15:02:31.797939 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-25xrb" event={"ID":"71bfb59e-8908-4b06-a1f9-e984d6a0b76f","Type":"ContainerStarted","Data":"b6475298ad08bc581075c127ce07a3ce2facddc3228d965287013961ed5c4ace"} Sep 30 15:02:31 crc kubenswrapper[4763]: I0930 15:02:31.797988 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-25xrb" event={"ID":"71bfb59e-8908-4b06-a1f9-e984d6a0b76f","Type":"ContainerStarted","Data":"0afcc72ccf868d8ed622334ed9fe68175ec45869c70885c7b062d6d450ac5749"} Sep 30 15:02:31 crc kubenswrapper[4763]: I0930 15:02:31.800695 4763 generic.go:334] "Generic (PLEG): container finished" podID="e339af44-3091-43cf-97d5-f0ea9f55a33d" containerID="a2349604a65c67d1fb95e2006c7abd9856cc2ca2f59837c433891851c19c4c94" exitCode=0 Sep 30 15:02:31 crc kubenswrapper[4763]: I0930 15:02:31.800748 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" event={"ID":"e339af44-3091-43cf-97d5-f0ea9f55a33d","Type":"ContainerDied","Data":"a2349604a65c67d1fb95e2006c7abd9856cc2ca2f59837c433891851c19c4c94"} Sep 30 15:02:31 crc kubenswrapper[4763]: I0930 15:02:31.813587 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-25xrb" podStartSLOduration=1.813567393 podStartE2EDuration="1.813567393s" podCreationTimestamp="2025-09-30 15:02:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:02:31.811581404 +0000 UTC m=+5223.950141709" watchObservedRunningTime="2025-09-30 15:02:31.813567393 +0000 UTC m=+5223.952127699" Sep 30 15:02:32 crc kubenswrapper[4763]: I0930 15:02:32.809758 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" event={"ID":"e339af44-3091-43cf-97d5-f0ea9f55a33d","Type":"ContainerStarted","Data":"512c3a33588eee96852f042cf641b023f0d01f4d1e0c6e5a746c032f8358c49f"} Sep 30 15:02:32 crc kubenswrapper[4763]: I0930 15:02:32.810299 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:32 crc kubenswrapper[4763]: I0930 15:02:32.830694 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" podStartSLOduration=3.830675897 podStartE2EDuration="3.830675897s" podCreationTimestamp="2025-09-30 15:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:02:32.827637152 +0000 UTC m=+5224.966197437" watchObservedRunningTime="2025-09-30 15:02:32.830675897 +0000 UTC m=+5224.969236182" Sep 30 15:02:34 crc kubenswrapper[4763]: I0930 15:02:34.825630 4763 generic.go:334] "Generic (PLEG): container finished" podID="71bfb59e-8908-4b06-a1f9-e984d6a0b76f" containerID="b6475298ad08bc581075c127ce07a3ce2facddc3228d965287013961ed5c4ace" exitCode=0 Sep 30 15:02:34 crc kubenswrapper[4763]: I0930 15:02:34.825699 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-25xrb" event={"ID":"71bfb59e-8908-4b06-a1f9-e984d6a0b76f","Type":"ContainerDied","Data":"b6475298ad08bc581075c127ce07a3ce2facddc3228d965287013961ed5c4ace"} Sep 30 15:02:35 crc kubenswrapper[4763]: I0930 15:02:35.489224 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:02:35 crc kubenswrapper[4763]: E0930 15:02:35.489732 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.178353 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.237830 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-credential-keys\") pod \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.237878 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h77x5\" (UniqueName: \"kubernetes.io/projected/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-kube-api-access-h77x5\") pod \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.237906 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-scripts\") pod \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.237941 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-fernet-keys\") pod \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.238799 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-config-data\") pod \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.239118 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-combined-ca-bundle\") pod \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\" (UID: \"71bfb59e-8908-4b06-a1f9-e984d6a0b76f\") " Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.243126 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "71bfb59e-8908-4b06-a1f9-e984d6a0b76f" (UID: "71bfb59e-8908-4b06-a1f9-e984d6a0b76f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.243693 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-kube-api-access-h77x5" (OuterVolumeSpecName: "kube-api-access-h77x5") pod "71bfb59e-8908-4b06-a1f9-e984d6a0b76f" (UID: "71bfb59e-8908-4b06-a1f9-e984d6a0b76f"). InnerVolumeSpecName "kube-api-access-h77x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.244784 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "71bfb59e-8908-4b06-a1f9-e984d6a0b76f" (UID: "71bfb59e-8908-4b06-a1f9-e984d6a0b76f"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.260836 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-scripts" (OuterVolumeSpecName: "scripts") pod "71bfb59e-8908-4b06-a1f9-e984d6a0b76f" (UID: "71bfb59e-8908-4b06-a1f9-e984d6a0b76f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.264241 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71bfb59e-8908-4b06-a1f9-e984d6a0b76f" (UID: "71bfb59e-8908-4b06-a1f9-e984d6a0b76f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.266297 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-config-data" (OuterVolumeSpecName: "config-data") pod "71bfb59e-8908-4b06-a1f9-e984d6a0b76f" (UID: "71bfb59e-8908-4b06-a1f9-e984d6a0b76f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.340842 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.340884 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.340892 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h77x5\" (UniqueName: \"kubernetes.io/projected/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-kube-api-access-h77x5\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.340905 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.340913 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.340921 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bfb59e-8908-4b06-a1f9-e984d6a0b76f-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.841203 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-25xrb" event={"ID":"71bfb59e-8908-4b06-a1f9-e984d6a0b76f","Type":"ContainerDied","Data":"0afcc72ccf868d8ed622334ed9fe68175ec45869c70885c7b062d6d450ac5749"} Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.841241 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0afcc72ccf868d8ed622334ed9fe68175ec45869c70885c7b062d6d450ac5749" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.841268 4763 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-25xrb" Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.922619 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-25xrb"] Sep 30 15:02:36 crc kubenswrapper[4763]: I0930 15:02:36.928008 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-25xrb"] Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.012164 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-txslw"] Sep 30 15:02:37 crc kubenswrapper[4763]: E0930 15:02:37.012591 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bfb59e-8908-4b06-a1f9-e984d6a0b76f" containerName="keystone-bootstrap" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.012633 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bfb59e-8908-4b06-a1f9-e984d6a0b76f" containerName="keystone-bootstrap" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.012850 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bfb59e-8908-4b06-a1f9-e984d6a0b76f" containerName="keystone-bootstrap" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.013518 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.015662 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.015667 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rkqp7" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.016138 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.016159 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.029553 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-txslw"] Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.053155 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-combined-ca-bundle\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.053226 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-config-data\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.053264 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-scripts\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.053311 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xfmm\" 
(UniqueName: \"kubernetes.io/projected/b793587a-a139-454e-9837-a388a88f9129-kube-api-access-5xfmm\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.053474 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-fernet-keys\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.053539 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-credential-keys\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.155119 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-fernet-keys\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.155171 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-credential-keys\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.155217 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-combined-ca-bundle\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.155247 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-config-data\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.155271 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-scripts\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.155299 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xfmm\" (UniqueName: \"kubernetes.io/projected/b793587a-a139-454e-9837-a388a88f9129-kube-api-access-5xfmm\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.160740 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-combined-ca-bundle\") pod 
\"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.160812 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-fernet-keys\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.161891 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-credential-keys\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.162169 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-config-data\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.163810 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-scripts\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.191300 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xfmm\" (UniqueName: \"kubernetes.io/projected/b793587a-a139-454e-9837-a388a88f9129-kube-api-access-5xfmm\") pod \"keystone-bootstrap-txslw\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.328748 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.752501 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-txslw"] Sep 30 15:02:37 crc kubenswrapper[4763]: I0930 15:02:37.850344 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-txslw" event={"ID":"b793587a-a139-454e-9837-a388a88f9129","Type":"ContainerStarted","Data":"db13da51c95694b5b75ece155f38e55595faabd9a6779e81ba64981f0838274f"} Sep 30 15:02:38 crc kubenswrapper[4763]: I0930 15:02:38.526412 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71bfb59e-8908-4b06-a1f9-e984d6a0b76f" path="/var/lib/kubelet/pods/71bfb59e-8908-4b06-a1f9-e984d6a0b76f/volumes" Sep 30 15:02:38 crc kubenswrapper[4763]: I0930 15:02:38.879870 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-txslw" event={"ID":"b793587a-a139-454e-9837-a388a88f9129","Type":"ContainerStarted","Data":"9a4b2e6b3eacaed3f798846d7d67802d3c12ac219becc86f971e03e25c2a2d8e"} Sep 30 15:02:38 crc kubenswrapper[4763]: I0930 15:02:38.901833 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-txslw" podStartSLOduration=2.901815438 podStartE2EDuration="2.901815438s" podCreationTimestamp="2025-09-30 15:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:02:38.895410539 +0000 UTC m=+5231.033970824" watchObservedRunningTime="2025-09-30 15:02:38.901815438 +0000 UTC m=+5231.040375723" Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.346792 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dcb497f8f-c9md6" Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.393031 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d5b75b95-wdvth"] Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.393326 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth" podUID="110dc091-c67e-40ca-af69-d37995b5c65f" containerName="dnsmasq-dns" containerID="cri-o://16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815" gracePeriod=10 Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.860689 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth" Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.910475 4763 generic.go:334] "Generic (PLEG): container finished" podID="110dc091-c67e-40ca-af69-d37995b5c65f" containerID="16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815" exitCode=0 Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.910705 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth" Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.911237 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth" event={"ID":"110dc091-c67e-40ca-af69-d37995b5c65f","Type":"ContainerDied","Data":"16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815"} Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.911268 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d5b75b95-wdvth" event={"ID":"110dc091-c67e-40ca-af69-d37995b5c65f","Type":"ContainerDied","Data":"fe6e582da481a7a4f98ac415757454e7978812135f36fcbbeca3929de50a1d05"} Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.911290 4763 scope.go:117] "RemoveContainer" containerID="16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815" Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.915098 4763 generic.go:334] "Generic (PLEG): container finished" podID="b793587a-a139-454e-9837-a388a88f9129" containerID="9a4b2e6b3eacaed3f798846d7d67802d3c12ac219becc86f971e03e25c2a2d8e" exitCode=0 Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.915129 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-txslw" event={"ID":"b793587a-a139-454e-9837-a388a88f9129","Type":"ContainerDied","Data":"9a4b2e6b3eacaed3f798846d7d67802d3c12ac219becc86f971e03e25c2a2d8e"} Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.927992 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-sb\") pod \"110dc091-c67e-40ca-af69-d37995b5c65f\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.928058 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44vbp\" (UniqueName: \"kubernetes.io/projected/110dc091-c67e-40ca-af69-d37995b5c65f-kube-api-access-44vbp\") pod \"110dc091-c67e-40ca-af69-d37995b5c65f\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.928079 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-nb\") pod \"110dc091-c67e-40ca-af69-d37995b5c65f\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.928140 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-config\") pod \"110dc091-c67e-40ca-af69-d37995b5c65f\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.928271 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-dns-svc\") pod \"110dc091-c67e-40ca-af69-d37995b5c65f\" (UID: \"110dc091-c67e-40ca-af69-d37995b5c65f\") " Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.947030 4763 scope.go:117] "RemoveContainer" containerID="eba587986a0110cb0916e039d76b2cbeeb3ddc94c8f956fe975ff293cf3e4d24" Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.947023 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/110dc091-c67e-40ca-af69-d37995b5c65f-kube-api-access-44vbp" (OuterVolumeSpecName: "kube-api-access-44vbp") pod "110dc091-c67e-40ca-af69-d37995b5c65f" (UID: "110dc091-c67e-40ca-af69-d37995b5c65f"). InnerVolumeSpecName "kube-api-access-44vbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.973615 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "110dc091-c67e-40ca-af69-d37995b5c65f" (UID: "110dc091-c67e-40ca-af69-d37995b5c65f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.973677 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-config" (OuterVolumeSpecName: "config") pod "110dc091-c67e-40ca-af69-d37995b5c65f" (UID: "110dc091-c67e-40ca-af69-d37995b5c65f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.981842 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "110dc091-c67e-40ca-af69-d37995b5c65f" (UID: "110dc091-c67e-40ca-af69-d37995b5c65f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 15:02:40 crc kubenswrapper[4763]: I0930 15:02:40.990851 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "110dc091-c67e-40ca-af69-d37995b5c65f" (UID: "110dc091-c67e-40ca-af69-d37995b5c65f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 15:02:41 crc kubenswrapper[4763]: I0930 15:02:41.021018 4763 scope.go:117] "RemoveContainer" containerID="16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815" Sep 30 15:02:41 crc kubenswrapper[4763]: E0930 15:02:41.021726 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815\": container with ID starting with 16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815 not found: ID does not exist" containerID="16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815" Sep 30 15:02:41 crc kubenswrapper[4763]: I0930 15:02:41.021877 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815"} err="failed to get container status \"16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815\": rpc error: code = NotFound desc = could not find container \"16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815\": container with ID starting with 16a95f20361d226e6d49e7b09d09e85c9e7bcb7968ca25edb775feedfe15e815 not found: ID does not exist" Sep 30 15:02:41 crc kubenswrapper[4763]: I0930 15:02:41.021970 4763 scope.go:117] "RemoveContainer" containerID="eba587986a0110cb0916e039d76b2cbeeb3ddc94c8f956fe975ff293cf3e4d24" Sep 30 15:02:41 crc kubenswrapper[4763]: E0930 15:02:41.022423 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eba587986a0110cb0916e039d76b2cbeeb3ddc94c8f956fe975ff293cf3e4d24\": container with ID starting with eba587986a0110cb0916e039d76b2cbeeb3ddc94c8f956fe975ff293cf3e4d24 not found: ID does not exist" containerID="eba587986a0110cb0916e039d76b2cbeeb3ddc94c8f956fe975ff293cf3e4d24" Sep 30 15:02:41 crc kubenswrapper[4763]: I0930 15:02:41.022466 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eba587986a0110cb0916e039d76b2cbeeb3ddc94c8f956fe975ff293cf3e4d24"} err="failed to get container status \"eba587986a0110cb0916e039d76b2cbeeb3ddc94c8f956fe975ff293cf3e4d24\": rpc error: code = NotFound desc = could not find container \"eba587986a0110cb0916e039d76b2cbeeb3ddc94c8f956fe975ff293cf3e4d24\": container with ID starting with eba587986a0110cb0916e039d76b2cbeeb3ddc94c8f956fe975ff293cf3e4d24 not found: ID does not exist" Sep 30 15:02:41 crc kubenswrapper[4763]: I0930 15:02:41.030794 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:41 crc kubenswrapper[4763]: I0930 15:02:41.031060 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:41 crc kubenswrapper[4763]: I0930 15:02:41.031144 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44vbp\" (UniqueName: \"kubernetes.io/projected/110dc091-c67e-40ca-af69-d37995b5c65f-kube-api-access-44vbp\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:41 crc kubenswrapper[4763]: I0930 15:02:41.031221 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:41 crc kubenswrapper[4763]: I0930 15:02:41.031288 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/110dc091-c67e-40ca-af69-d37995b5c65f-config\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:41 crc kubenswrapper[4763]: I0930 15:02:41.243286 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d5b75b95-wdvth"] Sep 30 15:02:41 crc kubenswrapper[4763]: I0930 15:02:41.249261 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86d5b75b95-wdvth"] Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.243315 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.393589 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-credential-keys\") pod \"b793587a-a139-454e-9837-a388a88f9129\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.393735 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-scripts\") pod \"b793587a-a139-454e-9837-a388a88f9129\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.393770 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-config-data\") pod \"b793587a-a139-454e-9837-a388a88f9129\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.393802 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-fernet-keys\") pod \"b793587a-a139-454e-9837-a388a88f9129\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.393910 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xfmm\" (UniqueName: \"kubernetes.io/projected/b793587a-a139-454e-9837-a388a88f9129-kube-api-access-5xfmm\") pod \"b793587a-a139-454e-9837-a388a88f9129\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.393973 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-combined-ca-bundle\") pod \"b793587a-a139-454e-9837-a388a88f9129\" (UID: \"b793587a-a139-454e-9837-a388a88f9129\") " Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.397406 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-scripts" (OuterVolumeSpecName: "scripts") pod "b793587a-a139-454e-9837-a388a88f9129" (UID: "b793587a-a139-454e-9837-a388a88f9129"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.397686 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b793587a-a139-454e-9837-a388a88f9129-kube-api-access-5xfmm" (OuterVolumeSpecName: "kube-api-access-5xfmm") pod "b793587a-a139-454e-9837-a388a88f9129" (UID: "b793587a-a139-454e-9837-a388a88f9129"). InnerVolumeSpecName "kube-api-access-5xfmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.397945 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b793587a-a139-454e-9837-a388a88f9129" (UID: "b793587a-a139-454e-9837-a388a88f9129"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.403914 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b793587a-a139-454e-9837-a388a88f9129" (UID: "b793587a-a139-454e-9837-a388a88f9129"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.418776 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b793587a-a139-454e-9837-a388a88f9129" (UID: "b793587a-a139-454e-9837-a388a88f9129"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.420060 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-config-data" (OuterVolumeSpecName: "config-data") pod "b793587a-a139-454e-9837-a388a88f9129" (UID: "b793587a-a139-454e-9837-a388a88f9129"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.495858 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.496116 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.496128 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.496137 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xfmm\" (UniqueName: \"kubernetes.io/projected/b793587a-a139-454e-9837-a388a88f9129-kube-api-access-5xfmm\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.496146 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.496181 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b793587a-a139-454e-9837-a388a88f9129-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.500762 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="110dc091-c67e-40ca-af69-d37995b5c65f" path="/var/lib/kubelet/pods/110dc091-c67e-40ca-af69-d37995b5c65f/volumes" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.934468 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-txslw" event={"ID":"b793587a-a139-454e-9837-a388a88f9129","Type":"ContainerDied","Data":"db13da51c95694b5b75ece155f38e55595faabd9a6779e81ba64981f0838274f"} Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.934512 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db13da51c95694b5b75ece155f38e55595faabd9a6779e81ba64981f0838274f" Sep 30 15:02:42 crc kubenswrapper[4763]: I0930 15:02:42.934569 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-txslw" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.112369 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c84dcf974-nst5l"] Sep 30 15:02:43 crc kubenswrapper[4763]: E0930 15:02:43.112694 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110dc091-c67e-40ca-af69-d37995b5c65f" containerName="init" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.112709 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="110dc091-c67e-40ca-af69-d37995b5c65f" containerName="init" Sep 30 15:02:43 crc kubenswrapper[4763]: E0930 15:02:43.112731 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b793587a-a139-454e-9837-a388a88f9129" containerName="keystone-bootstrap" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.112738 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b793587a-a139-454e-9837-a388a88f9129" containerName="keystone-bootstrap" Sep 30 15:02:43 crc kubenswrapper[4763]: E0930 15:02:43.112760 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110dc091-c67e-40ca-af69-d37995b5c65f" containerName="dnsmasq-dns" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.112766 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="110dc091-c67e-40ca-af69-d37995b5c65f" containerName="dnsmasq-dns" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.112913 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="110dc091-c67e-40ca-af69-d37995b5c65f" containerName="dnsmasq-dns" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.112939 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b793587a-a139-454e-9837-a388a88f9129" containerName="keystone-bootstrap" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.113503 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.115220 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rkqp7" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.115627 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.115633 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.115634 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.124471 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c84dcf974-nst5l"] Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.208046 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-combined-ca-bundle\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.208120 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv88w\" (UniqueName: \"kubernetes.io/projected/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-kube-api-access-kv88w\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.208159 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-credential-keys\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.208196 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-fernet-keys\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.208226 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-config-data\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.208266 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-scripts\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.309783 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-scripts\") pod \"keystone-c84dcf974-nst5l\" (UID: 
\"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.309880 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-combined-ca-bundle\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.309906 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv88w\" (UniqueName: \"kubernetes.io/projected/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-kube-api-access-kv88w\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.309936 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-credential-keys\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.309972 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-fernet-keys\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.309997 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-config-data\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.314912 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-combined-ca-bundle\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.315167 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-fernet-keys\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.315637 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-scripts\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.319826 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-credential-keys\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.320154 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-config-data\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.340345 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv88w\" (UniqueName: \"kubernetes.io/projected/8cdc6a79-868f-4c99-b08a-f9c740ca17a3-kube-api-access-kv88w\") pod \"keystone-c84dcf974-nst5l\" (UID: \"8cdc6a79-868f-4c99-b08a-f9c740ca17a3\") " pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.431879 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.875486 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c84dcf974-nst5l"] Sep 30 15:02:43 crc kubenswrapper[4763]: I0930 15:02:43.946181 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c84dcf974-nst5l" event={"ID":"8cdc6a79-868f-4c99-b08a-f9c740ca17a3","Type":"ContainerStarted","Data":"1c42b11d9c946fadeb7d27385b062a1eae00ce2b48f3810a2924a97323b27393"} Sep 30 15:02:44 crc kubenswrapper[4763]: I0930 15:02:44.956437 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c84dcf974-nst5l" event={"ID":"8cdc6a79-868f-4c99-b08a-f9c740ca17a3","Type":"ContainerStarted","Data":"4983eae6e8fb3858cf9d3d60567726fcc82b908781fdc741ae7daef5840c0147"} Sep 30 15:02:44 crc kubenswrapper[4763]: I0930 15:02:44.958080 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:02:44 crc kubenswrapper[4763]: I0930 15:02:44.980474 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-c84dcf974-nst5l" podStartSLOduration=1.9804448959999998 podStartE2EDuration="1.980444896s" podCreationTimestamp="2025-09-30 15:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:02:44.978158938 +0000 UTC m=+5237.116719223" watchObservedRunningTime="2025-09-30 15:02:44.980444896 +0000 UTC m=+5237.119005201" Sep 30 15:02:46 crc kubenswrapper[4763]: I0930 15:02:46.490060 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:02:46 crc kubenswrapper[4763]: I0930 15:02:46.974989 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"250f87b86eee2fc155a67416ec0df43385031426a4b801bb824303cda0afd8fa"} Sep 30 15:03:03 crc kubenswrapper[4763]: I0930 15:03:03.863623 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g5mnp"] Sep 30 15:03:03 crc kubenswrapper[4763]: I0930 15:03:03.866745 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:03 crc kubenswrapper[4763]: I0930 15:03:03.876930 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5mnp"] Sep 30 15:03:03 crc kubenswrapper[4763]: I0930 15:03:03.957468 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-utilities\") pod \"redhat-marketplace-g5mnp\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:03 crc kubenswrapper[4763]: I0930 15:03:03.957641 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t66g7\" (UniqueName: \"kubernetes.io/projected/1f9b2b49-4a50-4691-a288-d9845748e663-kube-api-access-t66g7\") pod \"redhat-marketplace-g5mnp\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:03 crc kubenswrapper[4763]: I0930 15:03:03.957684 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-catalog-content\") pod \"redhat-marketplace-g5mnp\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:04 crc kubenswrapper[4763]: I0930 15:03:04.059156 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t66g7\" (UniqueName: \"kubernetes.io/projected/1f9b2b49-4a50-4691-a288-d9845748e663-kube-api-access-t66g7\") pod \"redhat-marketplace-g5mnp\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:04 crc kubenswrapper[4763]: I0930 15:03:04.059545 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-catalog-content\") pod \"redhat-marketplace-g5mnp\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:04 crc kubenswrapper[4763]: I0930 15:03:04.059708 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-utilities\") pod \"redhat-marketplace-g5mnp\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:04 crc kubenswrapper[4763]: I0930 15:03:04.060138 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-catalog-content\") pod \"redhat-marketplace-g5mnp\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:04 crc kubenswrapper[4763]: I0930 15:03:04.060218 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-utilities\") pod \"redhat-marketplace-g5mnp\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:04 crc kubenswrapper[4763]: I0930 15:03:04.084996 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-t66g7\" (UniqueName: \"kubernetes.io/projected/1f9b2b49-4a50-4691-a288-d9845748e663-kube-api-access-t66g7\") pod \"redhat-marketplace-g5mnp\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:04 crc kubenswrapper[4763]: I0930 15:03:04.201838 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:04 crc kubenswrapper[4763]: I0930 15:03:04.640990 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5mnp"] Sep 30 15:03:04 crc kubenswrapper[4763]: W0930 15:03:04.645868 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f9b2b49_4a50_4691_a288_d9845748e663.slice/crio-8e971c399e5c157f2bb486a0edbfb64c7b5fff190c02b5a84bf39e0ab8c18e3c WatchSource:0}: Error finding container 8e971c399e5c157f2bb486a0edbfb64c7b5fff190c02b5a84bf39e0ab8c18e3c: Status 404 returned error can't find the container with id 8e971c399e5c157f2bb486a0edbfb64c7b5fff190c02b5a84bf39e0ab8c18e3c Sep 30 15:03:05 crc kubenswrapper[4763]: I0930 15:03:05.121544 4763 generic.go:334] "Generic (PLEG): container finished" podID="1f9b2b49-4a50-4691-a288-d9845748e663" containerID="4303e4c9431c4e90edf1e15644ce7264647a3779a35f7f7096c7cfc88048eb39" exitCode=0 Sep 30 15:03:05 crc kubenswrapper[4763]: I0930 15:03:05.121612 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5mnp" event={"ID":"1f9b2b49-4a50-4691-a288-d9845748e663","Type":"ContainerDied","Data":"4303e4c9431c4e90edf1e15644ce7264647a3779a35f7f7096c7cfc88048eb39"} Sep 30 15:03:05 crc kubenswrapper[4763]: I0930 15:03:05.121874 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5mnp" event={"ID":"1f9b2b49-4a50-4691-a288-d9845748e663","Type":"ContainerStarted","Data":"8e971c399e5c157f2bb486a0edbfb64c7b5fff190c02b5a84bf39e0ab8c18e3c"} Sep 30 15:03:07 crc kubenswrapper[4763]: I0930 15:03:07.144433 4763 generic.go:334] "Generic (PLEG): container finished" podID="1f9b2b49-4a50-4691-a288-d9845748e663" containerID="7d535f634ada87f95a692ea3ffd93a92f07b66d674665cef5797a102fec0930b" exitCode=0 Sep 30 15:03:07 crc kubenswrapper[4763]: I0930 15:03:07.144539 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5mnp" event={"ID":"1f9b2b49-4a50-4691-a288-d9845748e663","Type":"ContainerDied","Data":"7d535f634ada87f95a692ea3ffd93a92f07b66d674665cef5797a102fec0930b"} Sep 30 15:03:08 crc kubenswrapper[4763]: I0930 15:03:08.158124 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5mnp" event={"ID":"1f9b2b49-4a50-4691-a288-d9845748e663","Type":"ContainerStarted","Data":"45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33"} Sep 30 15:03:08 crc kubenswrapper[4763]: I0930 15:03:08.180704 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g5mnp" podStartSLOduration=2.408471976 podStartE2EDuration="5.18068692s" podCreationTimestamp="2025-09-30 15:03:03 +0000 UTC" firstStartedPulling="2025-09-30 15:03:05.123461227 +0000 UTC m=+5257.262021522" lastFinishedPulling="2025-09-30 15:03:07.895676181 +0000 UTC m=+5260.034236466" observedRunningTime="2025-09-30 15:03:08.175853119 +0000 UTC m=+5260.314413434" 
watchObservedRunningTime="2025-09-30 15:03:08.18068692 +0000 UTC m=+5260.319247205" Sep 30 15:03:14 crc kubenswrapper[4763]: I0930 15:03:14.202785 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:14 crc kubenswrapper[4763]: I0930 15:03:14.203396 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:14 crc kubenswrapper[4763]: I0930 15:03:14.278310 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:14 crc kubenswrapper[4763]: I0930 15:03:14.327794 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:14 crc kubenswrapper[4763]: I0930 15:03:14.521881 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5mnp"] Sep 30 15:03:15 crc kubenswrapper[4763]: I0930 15:03:15.036875 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-c84dcf974-nst5l" Sep 30 15:03:16 crc kubenswrapper[4763]: I0930 15:03:16.232650 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g5mnp" podUID="1f9b2b49-4a50-4691-a288-d9845748e663" containerName="registry-server" containerID="cri-o://45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33" gracePeriod=2 Sep 30 15:03:16 crc kubenswrapper[4763]: I0930 15:03:16.632454 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:16 crc kubenswrapper[4763]: I0930 15:03:16.688097 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-catalog-content\") pod \"1f9b2b49-4a50-4691-a288-d9845748e663\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " Sep 30 15:03:16 crc kubenswrapper[4763]: I0930 15:03:16.688291 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t66g7\" (UniqueName: \"kubernetes.io/projected/1f9b2b49-4a50-4691-a288-d9845748e663-kube-api-access-t66g7\") pod \"1f9b2b49-4a50-4691-a288-d9845748e663\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " Sep 30 15:03:16 crc kubenswrapper[4763]: I0930 15:03:16.688344 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-utilities\") pod \"1f9b2b49-4a50-4691-a288-d9845748e663\" (UID: \"1f9b2b49-4a50-4691-a288-d9845748e663\") " Sep 30 15:03:16 crc kubenswrapper[4763]: I0930 15:03:16.689475 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-utilities" (OuterVolumeSpecName: "utilities") pod "1f9b2b49-4a50-4691-a288-d9845748e663" (UID: "1f9b2b49-4a50-4691-a288-d9845748e663"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:03:16 crc kubenswrapper[4763]: I0930 15:03:16.693653 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f9b2b49-4a50-4691-a288-d9845748e663-kube-api-access-t66g7" (OuterVolumeSpecName: "kube-api-access-t66g7") pod "1f9b2b49-4a50-4691-a288-d9845748e663" (UID: "1f9b2b49-4a50-4691-a288-d9845748e663"). InnerVolumeSpecName "kube-api-access-t66g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:03:16 crc kubenswrapper[4763]: I0930 15:03:16.708165 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f9b2b49-4a50-4691-a288-d9845748e663" (UID: "1f9b2b49-4a50-4691-a288-d9845748e663"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:03:16 crc kubenswrapper[4763]: I0930 15:03:16.790463 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:03:16 crc kubenswrapper[4763]: I0930 15:03:16.790514 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t66g7\" (UniqueName: \"kubernetes.io/projected/1f9b2b49-4a50-4691-a288-d9845748e663-kube-api-access-t66g7\") on node \"crc\" DevicePath \"\"" Sep 30 15:03:16 crc kubenswrapper[4763]: I0930 15:03:16.790528 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9b2b49-4a50-4691-a288-d9845748e663-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.242767 4763 generic.go:334] "Generic (PLEG): container finished" podID="1f9b2b49-4a50-4691-a288-d9845748e663" containerID="45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33" exitCode=0 Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.242831 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5mnp" event={"ID":"1f9b2b49-4a50-4691-a288-d9845748e663","Type":"ContainerDied","Data":"45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33"} Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.242852 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5mnp" Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.243036 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5mnp" event={"ID":"1f9b2b49-4a50-4691-a288-d9845748e663","Type":"ContainerDied","Data":"8e971c399e5c157f2bb486a0edbfb64c7b5fff190c02b5a84bf39e0ab8c18e3c"} Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.243058 4763 scope.go:117] "RemoveContainer" containerID="45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33" Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.275147 4763 scope.go:117] "RemoveContainer" containerID="7d535f634ada87f95a692ea3ffd93a92f07b66d674665cef5797a102fec0930b" Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.279064 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5mnp"] Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.287177 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5mnp"] Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.306164 4763 scope.go:117] "RemoveContainer" containerID="4303e4c9431c4e90edf1e15644ce7264647a3779a35f7f7096c7cfc88048eb39" Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.340580 4763 scope.go:117] "RemoveContainer" containerID="45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33" Sep 30 15:03:17 crc kubenswrapper[4763]: E0930 15:03:17.341141 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33\": container with ID starting with 45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33 not found: ID does not exist" containerID="45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33" Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.341194 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33"} err="failed to get container status \"45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33\": rpc error: code = NotFound desc = could not find container \"45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33\": container with ID starting with 45baf854d71969fc2ef38917f9e69a74d1891b6903a3ad1cffeea2914f1bea33 not found: ID does not exist" Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.341225 4763 scope.go:117] "RemoveContainer" containerID="7d535f634ada87f95a692ea3ffd93a92f07b66d674665cef5797a102fec0930b" Sep 30 15:03:17 crc kubenswrapper[4763]: E0930 15:03:17.341532 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d535f634ada87f95a692ea3ffd93a92f07b66d674665cef5797a102fec0930b\": container with ID starting with 7d535f634ada87f95a692ea3ffd93a92f07b66d674665cef5797a102fec0930b not found: ID does not exist" containerID="7d535f634ada87f95a692ea3ffd93a92f07b66d674665cef5797a102fec0930b" Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.341573 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d535f634ada87f95a692ea3ffd93a92f07b66d674665cef5797a102fec0930b"} err="failed to get container status \"7d535f634ada87f95a692ea3ffd93a92f07b66d674665cef5797a102fec0930b\": rpc error: code = NotFound desc = could not find 
container \"7d535f634ada87f95a692ea3ffd93a92f07b66d674665cef5797a102fec0930b\": container with ID starting with 7d535f634ada87f95a692ea3ffd93a92f07b66d674665cef5797a102fec0930b not found: ID does not exist" Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.341815 4763 scope.go:117] "RemoveContainer" containerID="4303e4c9431c4e90edf1e15644ce7264647a3779a35f7f7096c7cfc88048eb39" Sep 30 15:03:17 crc kubenswrapper[4763]: E0930 15:03:17.342131 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4303e4c9431c4e90edf1e15644ce7264647a3779a35f7f7096c7cfc88048eb39\": container with ID starting with 4303e4c9431c4e90edf1e15644ce7264647a3779a35f7f7096c7cfc88048eb39 not found: ID does not exist" containerID="4303e4c9431c4e90edf1e15644ce7264647a3779a35f7f7096c7cfc88048eb39" Sep 30 15:03:17 crc kubenswrapper[4763]: I0930 15:03:17.342162 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4303e4c9431c4e90edf1e15644ce7264647a3779a35f7f7096c7cfc88048eb39"} err="failed to get container status \"4303e4c9431c4e90edf1e15644ce7264647a3779a35f7f7096c7cfc88048eb39\": rpc error: code = NotFound desc = could not find container \"4303e4c9431c4e90edf1e15644ce7264647a3779a35f7f7096c7cfc88048eb39\": container with ID starting with 4303e4c9431c4e90edf1e15644ce7264647a3779a35f7f7096c7cfc88048eb39 not found: ID does not exist" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.510032 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f9b2b49-4a50-4691-a288-d9845748e663" path="/var/lib/kubelet/pods/1f9b2b49-4a50-4691-a288-d9845748e663/volumes" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.769541 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 15:03:18 crc kubenswrapper[4763]: E0930 15:03:18.770295 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9b2b49-4a50-4691-a288-d9845748e663" containerName="registry-server" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.770434 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9b2b49-4a50-4691-a288-d9845748e663" containerName="registry-server" Sep 30 15:03:18 crc kubenswrapper[4763]: E0930 15:03:18.770558 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9b2b49-4a50-4691-a288-d9845748e663" containerName="extract-content" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.770663 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9b2b49-4a50-4691-a288-d9845748e663" containerName="extract-content" Sep 30 15:03:18 crc kubenswrapper[4763]: E0930 15:03:18.770743 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9b2b49-4a50-4691-a288-d9845748e663" containerName="extract-utilities" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.770800 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9b2b49-4a50-4691-a288-d9845748e663" containerName="extract-utilities" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.771019 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f9b2b49-4a50-4691-a288-d9845748e663" containerName="registry-server" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.771716 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.773921 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wwpt8" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.774040 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.775226 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.788202 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.826445 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcxw9\" (UniqueName: \"kubernetes.io/projected/7349e1a5-d644-4dd3-a6a9-4a132e7782db-kube-api-access-zcxw9\") pod \"openstackclient\" (UID: \"7349e1a5-d644-4dd3-a6a9-4a132e7782db\") " pod="openstack/openstackclient" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.826522 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7349e1a5-d644-4dd3-a6a9-4a132e7782db-openstack-config\") pod \"openstackclient\" (UID: \"7349e1a5-d644-4dd3-a6a9-4a132e7782db\") " pod="openstack/openstackclient" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.826573 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7349e1a5-d644-4dd3-a6a9-4a132e7782db-openstack-config-secret\") pod \"openstackclient\" (UID: \"7349e1a5-d644-4dd3-a6a9-4a132e7782db\") " pod="openstack/openstackclient" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.927995 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcxw9\" (UniqueName: \"kubernetes.io/projected/7349e1a5-d644-4dd3-a6a9-4a132e7782db-kube-api-access-zcxw9\") pod \"openstackclient\" (UID: \"7349e1a5-d644-4dd3-a6a9-4a132e7782db\") " pod="openstack/openstackclient" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.928091 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7349e1a5-d644-4dd3-a6a9-4a132e7782db-openstack-config\") pod \"openstackclient\" (UID: \"7349e1a5-d644-4dd3-a6a9-4a132e7782db\") " pod="openstack/openstackclient" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.928150 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7349e1a5-d644-4dd3-a6a9-4a132e7782db-openstack-config-secret\") pod \"openstackclient\" (UID: \"7349e1a5-d644-4dd3-a6a9-4a132e7782db\") " pod="openstack/openstackclient" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.929477 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7349e1a5-d644-4dd3-a6a9-4a132e7782db-openstack-config\") pod \"openstackclient\" (UID: \"7349e1a5-d644-4dd3-a6a9-4a132e7782db\") " pod="openstack/openstackclient" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.942319 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7349e1a5-d644-4dd3-a6a9-4a132e7782db-openstack-config-secret\") pod \"openstackclient\" (UID: \"7349e1a5-d644-4dd3-a6a9-4a132e7782db\") " pod="openstack/openstackclient" Sep 30 15:03:18 crc kubenswrapper[4763]: I0930 15:03:18.945523 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcxw9\" (UniqueName: \"kubernetes.io/projected/7349e1a5-d644-4dd3-a6a9-4a132e7782db-kube-api-access-zcxw9\") pod \"openstackclient\" (UID: \"7349e1a5-d644-4dd3-a6a9-4a132e7782db\") " pod="openstack/openstackclient" Sep 30 15:03:19 crc kubenswrapper[4763]: I0930 15:03:19.094826 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 15:03:19 crc kubenswrapper[4763]: I0930 15:03:19.510037 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 15:03:20 crc kubenswrapper[4763]: I0930 15:03:20.270030 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7349e1a5-d644-4dd3-a6a9-4a132e7782db","Type":"ContainerStarted","Data":"9c97aa5f4901abe1f0cb45d3c811251e3d01c9819fee24b51402cb118cb7304d"} Sep 30 15:03:20 crc kubenswrapper[4763]: I0930 15:03:20.270335 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7349e1a5-d644-4dd3-a6a9-4a132e7782db","Type":"ContainerStarted","Data":"99155511c6641cf0eb205618dd94ae7070c1381fdff93a040e03d3295f64e7e5"} Sep 30 15:03:20 crc kubenswrapper[4763]: I0930 15:03:20.297526 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.297504949 podStartE2EDuration="2.297504949s" podCreationTimestamp="2025-09-30 15:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:03:20.284823603 +0000 UTC m=+5272.423383908" watchObservedRunningTime="2025-09-30 15:03:20.297504949 +0000 UTC m=+5272.436065254" Sep 30 15:05:06 crc kubenswrapper[4763]: I0930 15:05:06.060153 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 15:05:06 crc kubenswrapper[4763]: I0930 15:05:06.060721 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 15:05:36 crc kubenswrapper[4763]: I0930 15:05:36.060049 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 15:05:36 crc kubenswrapper[4763]: I0930 15:05:36.062746 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Sep 30 15:05:57 crc kubenswrapper[4763]: I0930 15:05:57.809080 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-68kkj/must-gather-dmdrl"] Sep 30 15:05:57 crc kubenswrapper[4763]: I0930 15:05:57.810886 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-68kkj/must-gather-dmdrl" Sep 30 15:05:57 crc kubenswrapper[4763]: I0930 15:05:57.813907 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-68kkj"/"openshift-service-ca.crt" Sep 30 15:05:57 crc kubenswrapper[4763]: I0930 15:05:57.814121 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-68kkj"/"kube-root-ca.crt" Sep 30 15:05:57 crc kubenswrapper[4763]: I0930 15:05:57.814484 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-68kkj"/"default-dockercfg-bzkmd" Sep 30 15:05:57 crc kubenswrapper[4763]: I0930 15:05:57.820809 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-68kkj/must-gather-dmdrl"] Sep 30 15:05:57 crc kubenswrapper[4763]: I0930 15:05:57.928422 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd552\" (UniqueName: \"kubernetes.io/projected/00acb94e-4228-4d1d-9f74-1856acbc9d71-kube-api-access-wd552\") pod \"must-gather-dmdrl\" (UID: \"00acb94e-4228-4d1d-9f74-1856acbc9d71\") " pod="openshift-must-gather-68kkj/must-gather-dmdrl" Sep 30 15:05:57 crc kubenswrapper[4763]: I0930 15:05:57.928464 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/00acb94e-4228-4d1d-9f74-1856acbc9d71-must-gather-output\") pod \"must-gather-dmdrl\" (UID: \"00acb94e-4228-4d1d-9f74-1856acbc9d71\") " pod="openshift-must-gather-68kkj/must-gather-dmdrl" Sep 30 15:05:58 crc kubenswrapper[4763]: I0930 15:05:58.030071 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd552\" (UniqueName: \"kubernetes.io/projected/00acb94e-4228-4d1d-9f74-1856acbc9d71-kube-api-access-wd552\") pod \"must-gather-dmdrl\" (UID: \"00acb94e-4228-4d1d-9f74-1856acbc9d71\") " pod="openshift-must-gather-68kkj/must-gather-dmdrl" Sep 30 15:05:58 crc kubenswrapper[4763]: I0930 15:05:58.030117 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/00acb94e-4228-4d1d-9f74-1856acbc9d71-must-gather-output\") pod \"must-gather-dmdrl\" (UID: \"00acb94e-4228-4d1d-9f74-1856acbc9d71\") " pod="openshift-must-gather-68kkj/must-gather-dmdrl" Sep 30 15:05:58 crc kubenswrapper[4763]: I0930 15:05:58.030568 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/00acb94e-4228-4d1d-9f74-1856acbc9d71-must-gather-output\") pod \"must-gather-dmdrl\" (UID: \"00acb94e-4228-4d1d-9f74-1856acbc9d71\") " pod="openshift-must-gather-68kkj/must-gather-dmdrl" Sep 30 15:05:58 crc kubenswrapper[4763]: I0930 15:05:58.049035 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd552\" (UniqueName: \"kubernetes.io/projected/00acb94e-4228-4d1d-9f74-1856acbc9d71-kube-api-access-wd552\") pod \"must-gather-dmdrl\" (UID: \"00acb94e-4228-4d1d-9f74-1856acbc9d71\") " pod="openshift-must-gather-68kkj/must-gather-dmdrl" Sep 30 15:05:58 crc 
kubenswrapper[4763]: I0930 15:05:58.131591 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-68kkj/must-gather-dmdrl" Sep 30 15:05:58 crc kubenswrapper[4763]: I0930 15:05:58.611663 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-68kkj/must-gather-dmdrl"] Sep 30 15:05:58 crc kubenswrapper[4763]: W0930 15:05:58.615332 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00acb94e_4228_4d1d_9f74_1856acbc9d71.slice/crio-ab5914efc06ca547ec23c739591c6fc690bd641ef4c4b3f18f2c5a402f3ed3ca WatchSource:0}: Error finding container ab5914efc06ca547ec23c739591c6fc690bd641ef4c4b3f18f2c5a402f3ed3ca: Status 404 returned error can't find the container with id ab5914efc06ca547ec23c739591c6fc690bd641ef4c4b3f18f2c5a402f3ed3ca Sep 30 15:05:58 crc kubenswrapper[4763]: I0930 15:05:58.617874 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 15:05:59 crc kubenswrapper[4763]: I0930 15:05:59.644453 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/must-gather-dmdrl" event={"ID":"00acb94e-4228-4d1d-9f74-1856acbc9d71","Type":"ContainerStarted","Data":"ab5914efc06ca547ec23c739591c6fc690bd641ef4c4b3f18f2c5a402f3ed3ca"} Sep 30 15:06:03 crc kubenswrapper[4763]: I0930 15:06:03.679914 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/must-gather-dmdrl" event={"ID":"00acb94e-4228-4d1d-9f74-1856acbc9d71","Type":"ContainerStarted","Data":"c419163f3556c86546e46b86dde4b76a3cc62d84da4e3d9ef1f6ebb0db00133e"} Sep 30 15:06:03 crc kubenswrapper[4763]: I0930 15:06:03.681651 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/must-gather-dmdrl" event={"ID":"00acb94e-4228-4d1d-9f74-1856acbc9d71","Type":"ContainerStarted","Data":"fd447ad5c6030202550f7eaef684f6dd7b437341fee2afff64869eb8dde8ca2b"} Sep 30 15:06:03 crc kubenswrapper[4763]: I0930 15:06:03.720676 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-68kkj/must-gather-dmdrl" podStartSLOduration=2.765212726 podStartE2EDuration="6.720646506s" podCreationTimestamp="2025-09-30 15:05:57 +0000 UTC" firstStartedPulling="2025-09-30 15:05:58.617617267 +0000 UTC m=+5430.756177552" lastFinishedPulling="2025-09-30 15:06:02.573051047 +0000 UTC m=+5434.711611332" observedRunningTime="2025-09-30 15:06:03.716319988 +0000 UTC m=+5435.854880293" watchObservedRunningTime="2025-09-30 15:06:03.720646506 +0000 UTC m=+5435.859206811" Sep 30 15:06:05 crc kubenswrapper[4763]: I0930 15:06:05.487791 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-68kkj/crc-debug-j5hmp"] Sep 30 15:06:05 crc kubenswrapper[4763]: I0930 15:06:05.489668 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-j5hmp" Sep 30 15:06:05 crc kubenswrapper[4763]: I0930 15:06:05.560261 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6srx\" (UniqueName: \"kubernetes.io/projected/56961136-1a3f-43ee-a2ea-c69440273759-kube-api-access-m6srx\") pod \"crc-debug-j5hmp\" (UID: \"56961136-1a3f-43ee-a2ea-c69440273759\") " pod="openshift-must-gather-68kkj/crc-debug-j5hmp" Sep 30 15:06:05 crc kubenswrapper[4763]: I0930 15:06:05.560364 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56961136-1a3f-43ee-a2ea-c69440273759-host\") pod \"crc-debug-j5hmp\" (UID: \"56961136-1a3f-43ee-a2ea-c69440273759\") " pod="openshift-must-gather-68kkj/crc-debug-j5hmp" Sep 30 15:06:05 crc kubenswrapper[4763]: I0930 15:06:05.661754 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6srx\" (UniqueName: \"kubernetes.io/projected/56961136-1a3f-43ee-a2ea-c69440273759-kube-api-access-m6srx\") pod \"crc-debug-j5hmp\" (UID: \"56961136-1a3f-43ee-a2ea-c69440273759\") " pod="openshift-must-gather-68kkj/crc-debug-j5hmp" Sep 30 15:06:05 crc kubenswrapper[4763]: I0930 15:06:05.661849 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56961136-1a3f-43ee-a2ea-c69440273759-host\") pod \"crc-debug-j5hmp\" (UID: \"56961136-1a3f-43ee-a2ea-c69440273759\") " pod="openshift-must-gather-68kkj/crc-debug-j5hmp" Sep 30 15:06:05 crc kubenswrapper[4763]: I0930 15:06:05.661983 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56961136-1a3f-43ee-a2ea-c69440273759-host\") pod \"crc-debug-j5hmp\" (UID: \"56961136-1a3f-43ee-a2ea-c69440273759\") " pod="openshift-must-gather-68kkj/crc-debug-j5hmp" Sep 30 15:06:05 crc kubenswrapper[4763]: I0930 15:06:05.691822 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6srx\" (UniqueName: \"kubernetes.io/projected/56961136-1a3f-43ee-a2ea-c69440273759-kube-api-access-m6srx\") pod \"crc-debug-j5hmp\" (UID: \"56961136-1a3f-43ee-a2ea-c69440273759\") " pod="openshift-must-gather-68kkj/crc-debug-j5hmp" Sep 30 15:06:05 crc kubenswrapper[4763]: I0930 15:06:05.810266 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-j5hmp" Sep 30 15:06:06 crc kubenswrapper[4763]: I0930 15:06:06.060085 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 15:06:06 crc kubenswrapper[4763]: I0930 15:06:06.060460 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 15:06:06 crc kubenswrapper[4763]: I0930 15:06:06.060516 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 15:06:06 crc kubenswrapper[4763]: I0930 15:06:06.061254 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"250f87b86eee2fc155a67416ec0df43385031426a4b801bb824303cda0afd8fa"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 15:06:06 crc kubenswrapper[4763]: I0930 15:06:06.061343 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://250f87b86eee2fc155a67416ec0df43385031426a4b801bb824303cda0afd8fa" gracePeriod=600 Sep 30 15:06:06 crc kubenswrapper[4763]: I0930 15:06:06.712328 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="250f87b86eee2fc155a67416ec0df43385031426a4b801bb824303cda0afd8fa" exitCode=0 Sep 30 15:06:06 crc kubenswrapper[4763]: I0930 15:06:06.712436 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"250f87b86eee2fc155a67416ec0df43385031426a4b801bb824303cda0afd8fa"} Sep 30 15:06:06 crc kubenswrapper[4763]: I0930 15:06:06.712646 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44"} Sep 30 15:06:06 crc kubenswrapper[4763]: I0930 15:06:06.712667 4763 scope.go:117] "RemoveContainer" containerID="fb1ff904771791e6c5100be28d3414d21b5501e70add28a6456bc58a01e424e3" Sep 30 15:06:06 crc kubenswrapper[4763]: I0930 15:06:06.716257 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/crc-debug-j5hmp" event={"ID":"56961136-1a3f-43ee-a2ea-c69440273759","Type":"ContainerStarted","Data":"07f0f5d6dc2fec9fc3f58b4ded864c380459629d8c7fe62985d763fb81b9f925"} Sep 30 15:06:16 crc kubenswrapper[4763]: I0930 15:06:16.809506 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/crc-debug-j5hmp" 
event={"ID":"56961136-1a3f-43ee-a2ea-c69440273759","Type":"ContainerStarted","Data":"29820bb2fea2a4f2772a6fad3453630131caca36bb8a2fad0115767f294b169c"} Sep 30 15:06:16 crc kubenswrapper[4763]: I0930 15:06:16.826765 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-68kkj/crc-debug-j5hmp" podStartSLOduration=1.6243366369999999 podStartE2EDuration="11.826743502s" podCreationTimestamp="2025-09-30 15:06:05 +0000 UTC" firstStartedPulling="2025-09-30 15:06:05.844846059 +0000 UTC m=+5437.983406354" lastFinishedPulling="2025-09-30 15:06:16.047252924 +0000 UTC m=+5448.185813219" observedRunningTime="2025-09-30 15:06:16.820891276 +0000 UTC m=+5448.959451561" watchObservedRunningTime="2025-09-30 15:06:16.826743502 +0000 UTC m=+5448.965303797" Sep 30 15:06:34 crc kubenswrapper[4763]: I0930 15:06:34.734759 4763 scope.go:117] "RemoveContainer" containerID="08e4ef0ff1c31b2201a8788992884ff64f6030d207f2f9fbceb7eb99ca10448f" Sep 30 15:06:46 crc kubenswrapper[4763]: I0930 15:06:46.287751 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5dcb497f8f-c9md6_e339af44-3091-43cf-97d5-f0ea9f55a33d/init/0.log" Sep 30 15:06:46 crc kubenswrapper[4763]: I0930 15:06:46.496152 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5dcb497f8f-c9md6_e339af44-3091-43cf-97d5-f0ea9f55a33d/init/0.log" Sep 30 15:06:46 crc kubenswrapper[4763]: I0930 15:06:46.521395 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5dcb497f8f-c9md6_e339af44-3091-43cf-97d5-f0ea9f55a33d/dnsmasq-dns/0.log" Sep 30 15:06:46 crc kubenswrapper[4763]: I0930 15:06:46.726988 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-33f9-account-create-x9dxr_e6579aae-8365-4154-bc8b-c5af7b342ebb/mariadb-account-create/0.log" Sep 30 15:06:46 crc kubenswrapper[4763]: I0930 15:06:46.822770 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bootstrap-txslw_b793587a-a139-454e-9837-a388a88f9129/keystone-bootstrap/0.log" Sep 30 15:06:46 crc kubenswrapper[4763]: I0930 15:06:46.998159 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c84dcf974-nst5l_8cdc6a79-868f-4c99-b08a-f9c740ca17a3/keystone-api/0.log" Sep 30 15:06:47 crc kubenswrapper[4763]: I0930 15:06:47.174349 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-create-hlsjx_be97e602-d2d5-4dde-b3fc-3f013fabf8fc/mariadb-database-create/0.log" Sep 30 15:06:47 crc kubenswrapper[4763]: I0930 15:06:47.254671 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-wqkxw_ae0d63a0-c59a-4eb2-ae24-b65200d012b8/keystone-db-sync/0.log" Sep 30 15:06:47 crc kubenswrapper[4763]: I0930 15:06:47.386576 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_67af32f0-7954-4054-a5c0-cdb6da84d408/adoption/0.log" Sep 30 15:06:47 crc kubenswrapper[4763]: I0930 15:06:47.587948 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_66aa844f-2e00-40de-9084-1f68e6742ab1/memcached/0.log" Sep 30 15:06:47 crc kubenswrapper[4763]: I0930 15:06:47.727255 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561/mysql-bootstrap/0.log" Sep 30 15:06:47 crc kubenswrapper[4763]: I0930 15:06:47.966915 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561/mysql-bootstrap/0.log" Sep 30 15:06:48 crc kubenswrapper[4763]: I0930 15:06:48.012228 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46dcd4ce-0d5a-4e4d-8ccd-dc604f22c561/galera/0.log" Sep 30 15:06:48 crc kubenswrapper[4763]: I0930 15:06:48.171288 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_63a8ad25-874a-4688-a6cb-abfac16910a3/mysql-bootstrap/0.log" Sep 30 15:06:48 crc kubenswrapper[4763]: I0930 15:06:48.385810 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_63a8ad25-874a-4688-a6cb-abfac16910a3/mysql-bootstrap/0.log" Sep 30 15:06:48 crc kubenswrapper[4763]: I0930 15:06:48.396878 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_63a8ad25-874a-4688-a6cb-abfac16910a3/galera/0.log" Sep 30 15:06:48 crc kubenswrapper[4763]: I0930 15:06:48.620138 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7349e1a5-d644-4dd3-a6a9-4a132e7782db/openstackclient/0.log" Sep 30 15:06:48 crc kubenswrapper[4763]: I0930 15:06:48.631862 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_e7101812-e158-46b0-af76-063835526dc4/adoption/0.log" Sep 30 15:06:48 crc kubenswrapper[4763]: I0930 15:06:48.804683 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4f76f17e-9e81-4e2a-b3a1-428e4e54972d/openstack-network-exporter/0.log" Sep 30 15:06:48 crc kubenswrapper[4763]: I0930 15:06:48.832957 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4f76f17e-9e81-4e2a-b3a1-428e4e54972d/ovn-northd/0.log" Sep 30 15:06:49 crc kubenswrapper[4763]: I0930 15:06:49.246434 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2b82f57c-bca5-4fad-949d-13d9fdf45d62/openstack-network-exporter/0.log" Sep 30 15:06:49 crc kubenswrapper[4763]: I0930 15:06:49.296036 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2b82f57c-bca5-4fad-949d-13d9fdf45d62/ovsdbserver-nb/0.log" Sep 30 15:06:49 crc kubenswrapper[4763]: I0930 15:06:49.438502 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7b20faf9-26ae-48bf-9293-541e5b2c3468/openstack-network-exporter/0.log" Sep 30 15:06:49 crc kubenswrapper[4763]: I0930 15:06:49.506996 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7b20faf9-26ae-48bf-9293-541e5b2c3468/ovsdbserver-nb/0.log" Sep 30 15:06:49 crc kubenswrapper[4763]: I0930 15:06:49.686928 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_d44ff345-47d2-4e11-bb92-1b3e00eaba74/openstack-network-exporter/0.log" Sep 30 15:06:49 crc kubenswrapper[4763]: I0930 15:06:49.725065 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_d44ff345-47d2-4e11-bb92-1b3e00eaba74/ovsdbserver-nb/0.log" Sep 30 15:06:49 crc kubenswrapper[4763]: I0930 15:06:49.868170 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1252ad03-af5a-458c-a660-74b5389d2f50/openstack-network-exporter/0.log" Sep 30 15:06:49 crc kubenswrapper[4763]: I0930 15:06:49.920191 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_1252ad03-af5a-458c-a660-74b5389d2f50/ovsdbserver-sb/0.log" Sep 30 15:06:50 crc kubenswrapper[4763]: I0930 15:06:50.057713 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_746fc1ac-b115-4114-a611-4b2c18e779d3/openstack-network-exporter/0.log" Sep 30 15:06:50 crc kubenswrapper[4763]: I0930 15:06:50.111860 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_746fc1ac-b115-4114-a611-4b2c18e779d3/ovsdbserver-sb/0.log" Sep 30 15:06:50 crc kubenswrapper[4763]: I0930 15:06:50.183570 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_0e9670da-68b7-4ec2-ada3-51c74cabd937/openstack-network-exporter/0.log" Sep 30 15:06:50 crc kubenswrapper[4763]: I0930 15:06:50.272215 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_0e9670da-68b7-4ec2-ada3-51c74cabd937/ovsdbserver-sb/0.log" Sep 30 15:06:50 crc kubenswrapper[4763]: I0930 15:06:50.404307 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207/setup-container/0.log" Sep 30 15:06:50 crc kubenswrapper[4763]: I0930 15:06:50.548381 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207/setup-container/0.log" Sep 30 15:06:50 crc kubenswrapper[4763]: I0930 15:06:50.616377 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f6a0ad5b-1256-433d-b04a-aa120b250440/setup-container/0.log" Sep 30 15:06:50 crc kubenswrapper[4763]: I0930 15:06:50.644181 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d00e1ec3-beb8-42f5-b5e1-6dfc67bcb207/rabbitmq/0.log" Sep 30 15:06:50 crc kubenswrapper[4763]: I0930 15:06:50.820665 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f6a0ad5b-1256-433d-b04a-aa120b250440/setup-container/0.log" Sep 30 15:06:50 crc kubenswrapper[4763]: I0930 15:06:50.850360 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f6a0ad5b-1256-433d-b04a-aa120b250440/rabbitmq/0.log" Sep 30 15:07:42 crc kubenswrapper[4763]: I0930 15:07:42.508575 4763 generic.go:334] "Generic (PLEG): container finished" podID="56961136-1a3f-43ee-a2ea-c69440273759" containerID="29820bb2fea2a4f2772a6fad3453630131caca36bb8a2fad0115767f294b169c" exitCode=0 Sep 30 15:07:42 crc kubenswrapper[4763]: I0930 15:07:42.508674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/crc-debug-j5hmp" event={"ID":"56961136-1a3f-43ee-a2ea-c69440273759","Type":"ContainerDied","Data":"29820bb2fea2a4f2772a6fad3453630131caca36bb8a2fad0115767f294b169c"} Sep 30 15:07:43 crc kubenswrapper[4763]: I0930 15:07:43.596421 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-j5hmp" Sep 30 15:07:43 crc kubenswrapper[4763]: I0930 15:07:43.628013 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-68kkj/crc-debug-j5hmp"] Sep 30 15:07:43 crc kubenswrapper[4763]: I0930 15:07:43.634491 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-68kkj/crc-debug-j5hmp"] Sep 30 15:07:43 crc kubenswrapper[4763]: I0930 15:07:43.659654 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6srx\" (UniqueName: \"kubernetes.io/projected/56961136-1a3f-43ee-a2ea-c69440273759-kube-api-access-m6srx\") pod \"56961136-1a3f-43ee-a2ea-c69440273759\" (UID: \"56961136-1a3f-43ee-a2ea-c69440273759\") " Sep 30 15:07:43 crc kubenswrapper[4763]: I0930 15:07:43.659891 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56961136-1a3f-43ee-a2ea-c69440273759-host\") pod \"56961136-1a3f-43ee-a2ea-c69440273759\" (UID: \"56961136-1a3f-43ee-a2ea-c69440273759\") " Sep 30 15:07:43 crc kubenswrapper[4763]: I0930 15:07:43.660045 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56961136-1a3f-43ee-a2ea-c69440273759-host" (OuterVolumeSpecName: "host") pod "56961136-1a3f-43ee-a2ea-c69440273759" (UID: "56961136-1a3f-43ee-a2ea-c69440273759"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 15:07:43 crc kubenswrapper[4763]: I0930 15:07:43.660300 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56961136-1a3f-43ee-a2ea-c69440273759-host\") on node \"crc\" DevicePath \"\"" Sep 30 15:07:43 crc kubenswrapper[4763]: I0930 15:07:43.666157 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56961136-1a3f-43ee-a2ea-c69440273759-kube-api-access-m6srx" (OuterVolumeSpecName: "kube-api-access-m6srx") pod "56961136-1a3f-43ee-a2ea-c69440273759" (UID: "56961136-1a3f-43ee-a2ea-c69440273759"). InnerVolumeSpecName "kube-api-access-m6srx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:07:43 crc kubenswrapper[4763]: I0930 15:07:43.761886 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6srx\" (UniqueName: \"kubernetes.io/projected/56961136-1a3f-43ee-a2ea-c69440273759-kube-api-access-m6srx\") on node \"crc\" DevicePath \"\"" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.500873 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56961136-1a3f-43ee-a2ea-c69440273759" path="/var/lib/kubelet/pods/56961136-1a3f-43ee-a2ea-c69440273759/volumes" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.527765 4763 scope.go:117] "RemoveContainer" containerID="29820bb2fea2a4f2772a6fad3453630131caca36bb8a2fad0115767f294b169c" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.527815 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-j5hmp" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.775156 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-68kkj/crc-debug-kwrfc"] Sep 30 15:07:44 crc kubenswrapper[4763]: E0930 15:07:44.775518 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56961136-1a3f-43ee-a2ea-c69440273759" containerName="container-00" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.775530 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="56961136-1a3f-43ee-a2ea-c69440273759" containerName="container-00" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.775730 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="56961136-1a3f-43ee-a2ea-c69440273759" containerName="container-00" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.776257 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-kwrfc" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.878826 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9c98\" (UniqueName: \"kubernetes.io/projected/8ef783ec-bba9-4bb9-858d-6d07ade31e26-kube-api-access-q9c98\") pod \"crc-debug-kwrfc\" (UID: \"8ef783ec-bba9-4bb9-858d-6d07ade31e26\") " pod="openshift-must-gather-68kkj/crc-debug-kwrfc" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.878886 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ef783ec-bba9-4bb9-858d-6d07ade31e26-host\") pod \"crc-debug-kwrfc\" (UID: \"8ef783ec-bba9-4bb9-858d-6d07ade31e26\") " pod="openshift-must-gather-68kkj/crc-debug-kwrfc" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.980688 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9c98\" (UniqueName: \"kubernetes.io/projected/8ef783ec-bba9-4bb9-858d-6d07ade31e26-kube-api-access-q9c98\") pod \"crc-debug-kwrfc\" (UID: \"8ef783ec-bba9-4bb9-858d-6d07ade31e26\") " pod="openshift-must-gather-68kkj/crc-debug-kwrfc" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.980748 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ef783ec-bba9-4bb9-858d-6d07ade31e26-host\") pod \"crc-debug-kwrfc\" (UID: \"8ef783ec-bba9-4bb9-858d-6d07ade31e26\") " pod="openshift-must-gather-68kkj/crc-debug-kwrfc" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.980910 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ef783ec-bba9-4bb9-858d-6d07ade31e26-host\") pod \"crc-debug-kwrfc\" (UID: \"8ef783ec-bba9-4bb9-858d-6d07ade31e26\") " pod="openshift-must-gather-68kkj/crc-debug-kwrfc" Sep 30 15:07:44 crc kubenswrapper[4763]: I0930 15:07:44.998048 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9c98\" (UniqueName: \"kubernetes.io/projected/8ef783ec-bba9-4bb9-858d-6d07ade31e26-kube-api-access-q9c98\") pod \"crc-debug-kwrfc\" (UID: \"8ef783ec-bba9-4bb9-858d-6d07ade31e26\") " pod="openshift-must-gather-68kkj/crc-debug-kwrfc" Sep 30 15:07:45 crc kubenswrapper[4763]: I0930 15:07:45.103708 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-kwrfc" Sep 30 15:07:45 crc kubenswrapper[4763]: W0930 15:07:45.134385 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ef783ec_bba9_4bb9_858d_6d07ade31e26.slice/crio-d60e2140b6108c2657bcfe741b8d476bcdd24f02276b7fe0687c492c7f91e2f2 WatchSource:0}: Error finding container d60e2140b6108c2657bcfe741b8d476bcdd24f02276b7fe0687c492c7f91e2f2: Status 404 returned error can't find the container with id d60e2140b6108c2657bcfe741b8d476bcdd24f02276b7fe0687c492c7f91e2f2 Sep 30 15:07:45 crc kubenswrapper[4763]: I0930 15:07:45.536915 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/crc-debug-kwrfc" event={"ID":"8ef783ec-bba9-4bb9-858d-6d07ade31e26","Type":"ContainerStarted","Data":"9935cda85c78352ff4f8a59f7a9f890350df51a5df7daed61ffe7ee6cc55287a"} Sep 30 15:07:45 crc kubenswrapper[4763]: I0930 15:07:45.537234 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/crc-debug-kwrfc" event={"ID":"8ef783ec-bba9-4bb9-858d-6d07ade31e26","Type":"ContainerStarted","Data":"d60e2140b6108c2657bcfe741b8d476bcdd24f02276b7fe0687c492c7f91e2f2"} Sep 30 15:07:45 crc kubenswrapper[4763]: I0930 15:07:45.572733 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-68kkj/crc-debug-kwrfc" podStartSLOduration=1.572705398 podStartE2EDuration="1.572705398s" podCreationTimestamp="2025-09-30 15:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:07:45.568163675 +0000 UTC m=+5537.706723970" watchObservedRunningTime="2025-09-30 15:07:45.572705398 +0000 UTC m=+5537.711265683" Sep 30 15:07:46 crc kubenswrapper[4763]: I0930 15:07:46.546397 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ef783ec-bba9-4bb9-858d-6d07ade31e26" containerID="9935cda85c78352ff4f8a59f7a9f890350df51a5df7daed61ffe7ee6cc55287a" exitCode=0 Sep 30 15:07:46 crc kubenswrapper[4763]: I0930 15:07:46.546779 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/crc-debug-kwrfc" event={"ID":"8ef783ec-bba9-4bb9-858d-6d07ade31e26","Type":"ContainerDied","Data":"9935cda85c78352ff4f8a59f7a9f890350df51a5df7daed61ffe7ee6cc55287a"} Sep 30 15:07:47 crc kubenswrapper[4763]: I0930 15:07:47.635924 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-kwrfc" Sep 30 15:07:47 crc kubenswrapper[4763]: I0930 15:07:47.719300 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ef783ec-bba9-4bb9-858d-6d07ade31e26-host\") pod \"8ef783ec-bba9-4bb9-858d-6d07ade31e26\" (UID: \"8ef783ec-bba9-4bb9-858d-6d07ade31e26\") " Sep 30 15:07:47 crc kubenswrapper[4763]: I0930 15:07:47.719369 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9c98\" (UniqueName: \"kubernetes.io/projected/8ef783ec-bba9-4bb9-858d-6d07ade31e26-kube-api-access-q9c98\") pod \"8ef783ec-bba9-4bb9-858d-6d07ade31e26\" (UID: \"8ef783ec-bba9-4bb9-858d-6d07ade31e26\") " Sep 30 15:07:47 crc kubenswrapper[4763]: I0930 15:07:47.719418 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ef783ec-bba9-4bb9-858d-6d07ade31e26-host" (OuterVolumeSpecName: "host") pod "8ef783ec-bba9-4bb9-858d-6d07ade31e26" (UID: "8ef783ec-bba9-4bb9-858d-6d07ade31e26"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 15:07:47 crc kubenswrapper[4763]: I0930 15:07:47.719788 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ef783ec-bba9-4bb9-858d-6d07ade31e26-host\") on node \"crc\" DevicePath \"\"" Sep 30 15:07:47 crc kubenswrapper[4763]: I0930 15:07:47.727419 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef783ec-bba9-4bb9-858d-6d07ade31e26-kube-api-access-q9c98" (OuterVolumeSpecName: "kube-api-access-q9c98") pod "8ef783ec-bba9-4bb9-858d-6d07ade31e26" (UID: "8ef783ec-bba9-4bb9-858d-6d07ade31e26"). InnerVolumeSpecName "kube-api-access-q9c98". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:07:47 crc kubenswrapper[4763]: I0930 15:07:47.820921 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9c98\" (UniqueName: \"kubernetes.io/projected/8ef783ec-bba9-4bb9-858d-6d07ade31e26-kube-api-access-q9c98\") on node \"crc\" DevicePath \"\"" Sep 30 15:07:48 crc kubenswrapper[4763]: I0930 15:07:48.562406 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/crc-debug-kwrfc" event={"ID":"8ef783ec-bba9-4bb9-858d-6d07ade31e26","Type":"ContainerDied","Data":"d60e2140b6108c2657bcfe741b8d476bcdd24f02276b7fe0687c492c7f91e2f2"} Sep 30 15:07:48 crc kubenswrapper[4763]: I0930 15:07:48.562449 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d60e2140b6108c2657bcfe741b8d476bcdd24f02276b7fe0687c492c7f91e2f2" Sep 30 15:07:48 crc kubenswrapper[4763]: I0930 15:07:48.562503 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-kwrfc" Sep 30 15:07:52 crc kubenswrapper[4763]: I0930 15:07:52.352111 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-68kkj/crc-debug-kwrfc"] Sep 30 15:07:52 crc kubenswrapper[4763]: I0930 15:07:52.359867 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-68kkj/crc-debug-kwrfc"] Sep 30 15:07:52 crc kubenswrapper[4763]: I0930 15:07:52.498529 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef783ec-bba9-4bb9-858d-6d07ade31e26" path="/var/lib/kubelet/pods/8ef783ec-bba9-4bb9-858d-6d07ade31e26/volumes" Sep 30 15:07:53 crc kubenswrapper[4763]: I0930 15:07:53.541485 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-68kkj/crc-debug-pw9gx"] Sep 30 15:07:53 crc kubenswrapper[4763]: E0930 15:07:53.542332 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef783ec-bba9-4bb9-858d-6d07ade31e26" containerName="container-00" Sep 30 15:07:53 crc kubenswrapper[4763]: I0930 15:07:53.542400 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef783ec-bba9-4bb9-858d-6d07ade31e26" containerName="container-00" Sep 30 15:07:53 crc kubenswrapper[4763]: I0930 15:07:53.542642 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef783ec-bba9-4bb9-858d-6d07ade31e26" containerName="container-00" Sep 30 15:07:53 crc kubenswrapper[4763]: I0930 15:07:53.543364 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-pw9gx" Sep 30 15:07:53 crc kubenswrapper[4763]: I0930 15:07:53.597319 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5714c183-6bf5-4525-97b3-c45315ece4dc-host\") pod \"crc-debug-pw9gx\" (UID: \"5714c183-6bf5-4525-97b3-c45315ece4dc\") " pod="openshift-must-gather-68kkj/crc-debug-pw9gx" Sep 30 15:07:53 crc kubenswrapper[4763]: I0930 15:07:53.597367 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxtxn\" (UniqueName: \"kubernetes.io/projected/5714c183-6bf5-4525-97b3-c45315ece4dc-kube-api-access-pxtxn\") pod \"crc-debug-pw9gx\" (UID: \"5714c183-6bf5-4525-97b3-c45315ece4dc\") " pod="openshift-must-gather-68kkj/crc-debug-pw9gx" Sep 30 15:07:53 crc kubenswrapper[4763]: I0930 15:07:53.699651 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5714c183-6bf5-4525-97b3-c45315ece4dc-host\") pod \"crc-debug-pw9gx\" (UID: \"5714c183-6bf5-4525-97b3-c45315ece4dc\") " pod="openshift-must-gather-68kkj/crc-debug-pw9gx" Sep 30 15:07:53 crc kubenswrapper[4763]: I0930 15:07:53.699701 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxtxn\" (UniqueName: \"kubernetes.io/projected/5714c183-6bf5-4525-97b3-c45315ece4dc-kube-api-access-pxtxn\") pod \"crc-debug-pw9gx\" (UID: \"5714c183-6bf5-4525-97b3-c45315ece4dc\") " pod="openshift-must-gather-68kkj/crc-debug-pw9gx" Sep 30 15:07:53 crc kubenswrapper[4763]: I0930 15:07:53.699837 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5714c183-6bf5-4525-97b3-c45315ece4dc-host\") pod \"crc-debug-pw9gx\" (UID: \"5714c183-6bf5-4525-97b3-c45315ece4dc\") " pod="openshift-must-gather-68kkj/crc-debug-pw9gx" Sep 30 15:07:53 crc kubenswrapper[4763]: 
I0930 15:07:53.719710 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxtxn\" (UniqueName: \"kubernetes.io/projected/5714c183-6bf5-4525-97b3-c45315ece4dc-kube-api-access-pxtxn\") pod \"crc-debug-pw9gx\" (UID: \"5714c183-6bf5-4525-97b3-c45315ece4dc\") " pod="openshift-must-gather-68kkj/crc-debug-pw9gx" Sep 30 15:07:53 crc kubenswrapper[4763]: I0930 15:07:53.862027 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-pw9gx" Sep 30 15:07:54 crc kubenswrapper[4763]: I0930 15:07:54.612369 4763 generic.go:334] "Generic (PLEG): container finished" podID="5714c183-6bf5-4525-97b3-c45315ece4dc" containerID="078bd7c13034c6635307db2216249516b4db165ea9ace88c167210472e1384e1" exitCode=0 Sep 30 15:07:54 crc kubenswrapper[4763]: I0930 15:07:54.612483 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/crc-debug-pw9gx" event={"ID":"5714c183-6bf5-4525-97b3-c45315ece4dc","Type":"ContainerDied","Data":"078bd7c13034c6635307db2216249516b4db165ea9ace88c167210472e1384e1"} Sep 30 15:07:54 crc kubenswrapper[4763]: I0930 15:07:54.612663 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/crc-debug-pw9gx" event={"ID":"5714c183-6bf5-4525-97b3-c45315ece4dc","Type":"ContainerStarted","Data":"1845fc13bed7c4b8708a82f9e245d89e81b34e829adfdf73dd196a78bee8ad04"} Sep 30 15:07:54 crc kubenswrapper[4763]: I0930 15:07:54.652230 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-68kkj/crc-debug-pw9gx"] Sep 30 15:07:54 crc kubenswrapper[4763]: I0930 15:07:54.658127 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-68kkj/crc-debug-pw9gx"] Sep 30 15:07:55 crc kubenswrapper[4763]: I0930 15:07:55.704347 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-pw9gx" Sep 30 15:07:55 crc kubenswrapper[4763]: I0930 15:07:55.831259 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5714c183-6bf5-4525-97b3-c45315ece4dc-host\") pod \"5714c183-6bf5-4525-97b3-c45315ece4dc\" (UID: \"5714c183-6bf5-4525-97b3-c45315ece4dc\") " Sep 30 15:07:55 crc kubenswrapper[4763]: I0930 15:07:55.831433 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxtxn\" (UniqueName: \"kubernetes.io/projected/5714c183-6bf5-4525-97b3-c45315ece4dc-kube-api-access-pxtxn\") pod \"5714c183-6bf5-4525-97b3-c45315ece4dc\" (UID: \"5714c183-6bf5-4525-97b3-c45315ece4dc\") " Sep 30 15:07:55 crc kubenswrapper[4763]: I0930 15:07:55.831436 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5714c183-6bf5-4525-97b3-c45315ece4dc-host" (OuterVolumeSpecName: "host") pod "5714c183-6bf5-4525-97b3-c45315ece4dc" (UID: "5714c183-6bf5-4525-97b3-c45315ece4dc"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 15:07:55 crc kubenswrapper[4763]: I0930 15:07:55.831952 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5714c183-6bf5-4525-97b3-c45315ece4dc-host\") on node \"crc\" DevicePath \"\"" Sep 30 15:07:55 crc kubenswrapper[4763]: I0930 15:07:55.845906 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5714c183-6bf5-4525-97b3-c45315ece4dc-kube-api-access-pxtxn" (OuterVolumeSpecName: "kube-api-access-pxtxn") pod "5714c183-6bf5-4525-97b3-c45315ece4dc" (UID: "5714c183-6bf5-4525-97b3-c45315ece4dc"). InnerVolumeSpecName "kube-api-access-pxtxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:07:55 crc kubenswrapper[4763]: I0930 15:07:55.933067 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxtxn\" (UniqueName: \"kubernetes.io/projected/5714c183-6bf5-4525-97b3-c45315ece4dc-kube-api-access-pxtxn\") on node \"crc\" DevicePath \"\"" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.302734 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf_e99c5c15-af06-4688-8573-2faf4351d2d0/util/0.log" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.492341 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf_e99c5c15-af06-4688-8573-2faf4351d2d0/pull/0.log" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.499247 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5714c183-6bf5-4525-97b3-c45315ece4dc" path="/var/lib/kubelet/pods/5714c183-6bf5-4525-97b3-c45315ece4dc/volumes" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.519358 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf_e99c5c15-af06-4688-8573-2faf4351d2d0/pull/0.log" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.535696 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf_e99c5c15-af06-4688-8573-2faf4351d2d0/util/0.log" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.649884 4763 scope.go:117] "RemoveContainer" containerID="078bd7c13034c6635307db2216249516b4db165ea9ace88c167210472e1384e1" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.650089 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-68kkj/crc-debug-pw9gx" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.744529 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf_e99c5c15-af06-4688-8573-2faf4351d2d0/util/0.log" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.751316 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf_e99c5c15-af06-4688-8573-2faf4351d2d0/pull/0.log" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.769608 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469tprgf_e99c5c15-af06-4688-8573-2faf4351d2d0/extract/0.log" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.902843 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-8s74j_29c17248-6b6c-4ab7-8204-0f5d34a30da5/kube-rbac-proxy/0.log" Sep 30 15:07:56 crc kubenswrapper[4763]: I0930 15:07:56.991193 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-8s74j_29c17248-6b6c-4ab7-8204-0f5d34a30da5/manager/0.log" Sep 30 15:07:57 crc kubenswrapper[4763]: I0930 15:07:57.007160 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-rpc2r_f38cae68-c345-48f7-9be3-ea9467cb5485/kube-rbac-proxy/0.log" Sep 30 15:07:57 crc kubenswrapper[4763]: I0930 15:07:57.138817 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-rpc2r_f38cae68-c345-48f7-9be3-ea9467cb5485/manager/0.log" Sep 30 15:07:57 crc kubenswrapper[4763]: I0930 15:07:57.202317 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-lhnzv_a4cba4a2-dc1b-485e-b141-a4d7f82176ac/kube-rbac-proxy/0.log" Sep 30 15:07:57 crc kubenswrapper[4763]: I0930 15:07:57.226385 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-lhnzv_a4cba4a2-dc1b-485e-b141-a4d7f82176ac/manager/0.log" Sep 30 15:07:57 crc kubenswrapper[4763]: I0930 15:07:57.435074 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-vkwpn_008d7fd6-b4bf-44bd-b06f-fc3a8787cb66/kube-rbac-proxy/0.log" Sep 30 15:07:57 crc kubenswrapper[4763]: I0930 15:07:57.447347 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-vkwpn_008d7fd6-b4bf-44bd-b06f-fc3a8787cb66/manager/0.log" Sep 30 15:07:57 crc kubenswrapper[4763]: I0930 15:07:57.565341 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-hkxs6_3b13bc76-0bcb-48f3-9e18-f04720087325/kube-rbac-proxy/0.log" Sep 30 15:07:57 crc kubenswrapper[4763]: I0930 15:07:57.641614 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-hkxs6_3b13bc76-0bcb-48f3-9e18-f04720087325/manager/0.log" Sep 30 15:07:57 crc kubenswrapper[4763]: I0930 15:07:57.654738 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-8ftj5_ea3d1e11-a06c-4cc4-af77-725fdafb57c8/kube-rbac-proxy/0.log" Sep 30 15:07:57 crc kubenswrapper[4763]: I0930 15:07:57.766616 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-8ftj5_ea3d1e11-a06c-4cc4-af77-725fdafb57c8/manager/0.log" Sep 30 15:07:57 crc kubenswrapper[4763]: I0930 15:07:57.840638 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c7d9477-g7cnm_ec701ff9-8d7e-43ef-8887-bafe3f09deba/kube-rbac-proxy/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.051268 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c7d9477-g7cnm_ec701ff9-8d7e-43ef-8887-bafe3f09deba/manager/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.082948 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f589bc7f7-qbqvs_8622909b-a085-4553-8bbc-9a33d5f6df74/kube-rbac-proxy/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.084380 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f589bc7f7-qbqvs_8622909b-a085-4553-8bbc-9a33d5f6df74/manager/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.250918 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-2sn8f_1b8d1e87-64b4-4462-a46d-822489fa80f7/kube-rbac-proxy/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.301000 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-2sn8f_1b8d1e87-64b4-4462-a46d-822489fa80f7/manager/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.390585 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-594nq_5d3c4b15-3e62-4fe1-ba6c-37100873dc7e/kube-rbac-proxy/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.425654 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-594nq_5d3c4b15-3e62-4fe1-ba6c-37100873dc7e/manager/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.478229 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-6grkh_7dccabec-591d-4737-8977-1aa8b6fd5907/kube-rbac-proxy/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.607350 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-6grkh_7dccabec-591d-4737-8977-1aa8b6fd5907/manager/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.687304 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b96467f46-vtdfz_08f0bef7-63c6-4118-a5f3-953efc2e638c/kube-rbac-proxy/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.706942 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b96467f46-vtdfz_08f0bef7-63c6-4118-a5f3-953efc2e638c/manager/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.788345 4763 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79f9fc9fd8-22fzq_4addb186-b77b-4a86-85fc-87604ccb3c09/kube-rbac-proxy/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.953344 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79f9fc9fd8-22fzq_4addb186-b77b-4a86-85fc-87604ccb3c09/manager/0.log" Sep 30 15:07:58 crc kubenswrapper[4763]: I0930 15:07:58.986568 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fb7d6b8bf-6mp69_10a71b21-16cb-4064-b639-fbfc2893812a/kube-rbac-proxy/0.log" Sep 30 15:07:59 crc kubenswrapper[4763]: I0930 15:07:59.174584 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fb7d6b8bf-6mp69_10a71b21-16cb-4064-b639-fbfc2893812a/manager/0.log" Sep 30 15:07:59 crc kubenswrapper[4763]: I0930 15:07:59.288210 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds_5acc6630-bf7d-4acf-b724-60e722171e8f/kube-rbac-proxy/0.log" Sep 30 15:07:59 crc kubenswrapper[4763]: I0930 15:07:59.370407 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86b7cb4c5fx74ds_5acc6630-bf7d-4acf-b724-60e722171e8f/manager/0.log" Sep 30 15:07:59 crc kubenswrapper[4763]: I0930 15:07:59.478186 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b7bb8bd67-m2cnh_d8b604b2-49b7-4471-9e02-161a0caebc4b/kube-rbac-proxy/0.log" Sep 30 15:07:59 crc kubenswrapper[4763]: I0930 15:07:59.686539 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56dc567787-vbrll_db30003e-6ca7-40a4-a3cf-9487f505109b/kube-rbac-proxy/0.log" Sep 30 15:07:59 crc kubenswrapper[4763]: I0930 15:07:59.944301 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56dc567787-vbrll_db30003e-6ca7-40a4-a3cf-9487f505109b/operator/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.034037 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-9bbxd_203eb617-841b-4e63-8258-10ddfde53da0/registry-server/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.081780 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-r5sn8_55edd3bf-c291-4659-a6dc-1c348d04799c/kube-rbac-proxy/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.279448 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-r5sn8_55edd3bf-c291-4659-a6dc-1c348d04799c/manager/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.334326 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-jkggm_5bf8cc8d-90d0-4687-bfca-fd75f8d1c308/kube-rbac-proxy/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.389270 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-jkggm_5bf8cc8d-90d0-4687-bfca-fd75f8d1c308/manager/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: 
I0930 15:08:00.461323 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b7bb8bd67-m2cnh_d8b604b2-49b7-4471-9e02-161a0caebc4b/manager/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.493279 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-fdr5r_6efd5b9a-3e7d-4913-930d-3fe4452551b6/operator/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.600126 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-657c6b68c7-wnh7g_6a1bd649-6042-4f29-b6ab-cb3bcfcdca51/kube-rbac-proxy/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.666777 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-657c6b68c7-wnh7g_6a1bd649-6042-4f29-b6ab-cb3bcfcdca51/manager/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.755970 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-p95zl_14beb357-7d8b-4cbd-bda6-56eddcd765b0/kube-rbac-proxy/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.802480 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-p95zl_14beb357-7d8b-4cbd-bda6-56eddcd765b0/manager/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.919080 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb97fcf96-v8f7j_02b9b96d-a908-4964-ac52-b5b8f73ffbef/kube-rbac-proxy/0.log" Sep 30 15:08:00 crc kubenswrapper[4763]: I0930 15:08:00.987153 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb97fcf96-v8f7j_02b9b96d-a908-4964-ac52-b5b8f73ffbef/manager/0.log" Sep 30 15:08:01 crc kubenswrapper[4763]: I0930 15:08:01.050114 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75756dd4d9-72d45_9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd/kube-rbac-proxy/0.log" Sep 30 15:08:01 crc kubenswrapper[4763]: I0930 15:08:01.078428 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75756dd4d9-72d45_9c68c2c4-d0be-4bf4-a83c-975d1eb9a1dd/manager/0.log" Sep 30 15:08:06 crc kubenswrapper[4763]: I0930 15:08:06.059973 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 15:08:06 crc kubenswrapper[4763]: I0930 15:08:06.060563 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 15:08:15 crc kubenswrapper[4763]: I0930 15:08:15.742416 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ms8hj_b20ac013-c0be-4b7a-b5a8-cd6db89814ee/control-plane-machine-set-operator/0.log" Sep 30 15:08:15 crc 
kubenswrapper[4763]: I0930 15:08:15.927829 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jqx2k_817d1626-d4a3-4df7-bbbd-0ae698936819/machine-api-operator/0.log" Sep 30 15:08:15 crc kubenswrapper[4763]: I0930 15:08:15.933860 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jqx2k_817d1626-d4a3-4df7-bbbd-0ae698936819/kube-rbac-proxy/0.log" Sep 30 15:08:26 crc kubenswrapper[4763]: I0930 15:08:26.748232 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-9kkpz_88dbbf99-11f5-4a82-95d8-a0c0e97d7d76/cert-manager-controller/0.log" Sep 30 15:08:26 crc kubenswrapper[4763]: I0930 15:08:26.821935 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-gt2hz_53d7cd58-3d29-4172-8ecf-f7117a00e79f/cert-manager-cainjector/0.log" Sep 30 15:08:26 crc kubenswrapper[4763]: I0930 15:08:26.911515 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-prxdp_d74685c2-f75e-4c1e-84e7-63c7bc19f221/cert-manager-webhook/0.log" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.806681 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bndxz"] Sep 30 15:08:33 crc kubenswrapper[4763]: E0930 15:08:33.807497 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5714c183-6bf5-4525-97b3-c45315ece4dc" containerName="container-00" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.807509 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5714c183-6bf5-4525-97b3-c45315ece4dc" containerName="container-00" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.807677 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5714c183-6bf5-4525-97b3-c45315ece4dc" containerName="container-00" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.808965 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.834172 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8ldf\" (UniqueName: \"kubernetes.io/projected/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-kube-api-access-r8ldf\") pod \"community-operators-bndxz\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.834250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-catalog-content\") pod \"community-operators-bndxz\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.834329 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-utilities\") pod \"community-operators-bndxz\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.836014 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bndxz"] Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.935879 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8ldf\" (UniqueName: \"kubernetes.io/projected/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-kube-api-access-r8ldf\") pod \"community-operators-bndxz\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.935975 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-catalog-content\") pod \"community-operators-bndxz\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.936008 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-utilities\") pod \"community-operators-bndxz\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.936729 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-catalog-content\") pod \"community-operators-bndxz\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.936740 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-utilities\") pod \"community-operators-bndxz\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:33 crc kubenswrapper[4763]: I0930 15:08:33.964594 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r8ldf\" (UniqueName: \"kubernetes.io/projected/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-kube-api-access-r8ldf\") pod \"community-operators-bndxz\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:34 crc kubenswrapper[4763]: I0930 15:08:34.126430 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:34 crc kubenswrapper[4763]: I0930 15:08:34.607229 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bndxz"] Sep 30 15:08:34 crc kubenswrapper[4763]: I0930 15:08:34.804229 4763 scope.go:117] "RemoveContainer" containerID="b6475298ad08bc581075c127ce07a3ce2facddc3228d965287013961ed5c4ace" Sep 30 15:08:34 crc kubenswrapper[4763]: I0930 15:08:34.954533 4763 generic.go:334] "Generic (PLEG): container finished" podID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" containerID="cbbdcf014cb1cfb512b15cc3f0b27d73ea5ddce9da0a4899ae31f944a867abae" exitCode=0 Sep 30 15:08:34 crc kubenswrapper[4763]: I0930 15:08:34.954575 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bndxz" event={"ID":"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c","Type":"ContainerDied","Data":"cbbdcf014cb1cfb512b15cc3f0b27d73ea5ddce9da0a4899ae31f944a867abae"} Sep 30 15:08:34 crc kubenswrapper[4763]: I0930 15:08:34.954867 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bndxz" event={"ID":"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c","Type":"ContainerStarted","Data":"aa2bd62b8a5bec840698c61ad9d4721879a26126627c03cc2f2c8304ac163620"} Sep 30 15:08:35 crc kubenswrapper[4763]: I0930 15:08:35.964094 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bndxz" event={"ID":"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c","Type":"ContainerStarted","Data":"dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85"} Sep 30 15:08:36 crc kubenswrapper[4763]: I0930 15:08:36.060058 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 15:08:36 crc kubenswrapper[4763]: I0930 15:08:36.060119 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 15:08:36 crc kubenswrapper[4763]: I0930 15:08:36.973326 4763 generic.go:334] "Generic (PLEG): container finished" podID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" containerID="dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85" exitCode=0 Sep 30 15:08:36 crc kubenswrapper[4763]: I0930 15:08:36.973392 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bndxz" event={"ID":"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c","Type":"ContainerDied","Data":"dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85"} Sep 30 15:08:37 crc kubenswrapper[4763]: I0930 15:08:37.684529 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-85wg7_04c0360a-87b1-434f-8d7b-9aadd2e5ab33/nmstate-console-plugin/0.log" Sep 30 15:08:37 crc kubenswrapper[4763]: I0930 15:08:37.878166 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-d8g2c_535ccdf4-0560-4eb1-bfc6-8135453e4e11/nmstate-handler/0.log" Sep 30 15:08:37 crc kubenswrapper[4763]: I0930 15:08:37.964446 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-7zvfk_8f52d94c-384b-4cbe-ac9d-aeffdb2769bb/kube-rbac-proxy/0.log" Sep 30 15:08:37 crc kubenswrapper[4763]: I0930 15:08:37.983237 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bndxz" event={"ID":"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c","Type":"ContainerStarted","Data":"264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed"} Sep 30 15:08:38 crc kubenswrapper[4763]: I0930 15:08:38.006421 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-7zvfk_8f52d94c-384b-4cbe-ac9d-aeffdb2769bb/nmstate-metrics/0.log" Sep 30 15:08:38 crc kubenswrapper[4763]: I0930 15:08:38.008021 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bndxz" podStartSLOduration=2.434472979 podStartE2EDuration="5.008004157s" podCreationTimestamp="2025-09-30 15:08:33 +0000 UTC" firstStartedPulling="2025-09-30 15:08:34.956014897 +0000 UTC m=+5587.094575182" lastFinishedPulling="2025-09-30 15:08:37.529546075 +0000 UTC m=+5589.668106360" observedRunningTime="2025-09-30 15:08:38.001961797 +0000 UTC m=+5590.140522102" watchObservedRunningTime="2025-09-30 15:08:38.008004157 +0000 UTC m=+5590.146564442" Sep 30 15:08:38 crc kubenswrapper[4763]: I0930 15:08:38.209406 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-dqq6n_668f7b93-4e0d-4344-b856-1507f347c5a1/nmstate-webhook/0.log" Sep 30 15:08:38 crc kubenswrapper[4763]: I0930 15:08:38.211387 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-gdkg2_099f1320-84d1-45bd-a71b-36248dadb714/nmstate-operator/0.log" Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.624712 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-djj6s"] Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.626823 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.646934 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-djj6s"] Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.782558 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-catalog-content\") pod \"certified-operators-djj6s\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.782872 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-utilities\") pod \"certified-operators-djj6s\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.782917 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jf2k\" (UniqueName: \"kubernetes.io/projected/efb83c05-8409-40c0-922b-0453b881bd65-kube-api-access-9jf2k\") pod \"certified-operators-djj6s\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.883478 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-catalog-content\") pod \"certified-operators-djj6s\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.883539 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-utilities\") pod \"certified-operators-djj6s\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.883563 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jf2k\" (UniqueName: \"kubernetes.io/projected/efb83c05-8409-40c0-922b-0453b881bd65-kube-api-access-9jf2k\") pod \"certified-operators-djj6s\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.884002 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-catalog-content\") pod \"certified-operators-djj6s\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.884131 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-utilities\") pod \"certified-operators-djj6s\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.909557 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9jf2k\" (UniqueName: \"kubernetes.io/projected/efb83c05-8409-40c0-922b-0453b881bd65-kube-api-access-9jf2k\") pod \"certified-operators-djj6s\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:42 crc kubenswrapper[4763]: I0930 15:08:42.947993 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:43 crc kubenswrapper[4763]: I0930 15:08:43.586473 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-djj6s"] Sep 30 15:08:44 crc kubenswrapper[4763]: I0930 15:08:44.037075 4763 generic.go:334] "Generic (PLEG): container finished" podID="efb83c05-8409-40c0-922b-0453b881bd65" containerID="1e2b460990e5cf728da9f204793be69045ab8bdc08b04998cde5123a306b18f1" exitCode=0 Sep 30 15:08:44 crc kubenswrapper[4763]: I0930 15:08:44.037379 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djj6s" event={"ID":"efb83c05-8409-40c0-922b-0453b881bd65","Type":"ContainerDied","Data":"1e2b460990e5cf728da9f204793be69045ab8bdc08b04998cde5123a306b18f1"} Sep 30 15:08:44 crc kubenswrapper[4763]: I0930 15:08:44.037410 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djj6s" event={"ID":"efb83c05-8409-40c0-922b-0453b881bd65","Type":"ContainerStarted","Data":"28dfb0c0b71f499827dfd4291cf3cb5a4bb822063c46550195dfa6aced31c984"} Sep 30 15:08:44 crc kubenswrapper[4763]: I0930 15:08:44.127490 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:44 crc kubenswrapper[4763]: I0930 15:08:44.127848 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:44 crc kubenswrapper[4763]: I0930 15:08:44.171195 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:45 crc kubenswrapper[4763]: I0930 15:08:45.049473 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djj6s" event={"ID":"efb83c05-8409-40c0-922b-0453b881bd65","Type":"ContainerStarted","Data":"c7f2beec924e4f99cc1b8dafb99ec370d6d16320f80cc27f87086e47102791a9"} Sep 30 15:08:45 crc kubenswrapper[4763]: I0930 15:08:45.120953 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:46 crc kubenswrapper[4763]: I0930 15:08:46.062156 4763 generic.go:334] "Generic (PLEG): container finished" podID="efb83c05-8409-40c0-922b-0453b881bd65" containerID="c7f2beec924e4f99cc1b8dafb99ec370d6d16320f80cc27f87086e47102791a9" exitCode=0 Sep 30 15:08:46 crc kubenswrapper[4763]: I0930 15:08:46.062845 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djj6s" event={"ID":"efb83c05-8409-40c0-922b-0453b881bd65","Type":"ContainerDied","Data":"c7f2beec924e4f99cc1b8dafb99ec370d6d16320f80cc27f87086e47102791a9"} Sep 30 15:08:46 crc kubenswrapper[4763]: I0930 15:08:46.403400 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bndxz"] Sep 30 15:08:47 crc kubenswrapper[4763]: I0930 15:08:47.071910 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-djj6s" event={"ID":"efb83c05-8409-40c0-922b-0453b881bd65","Type":"ContainerStarted","Data":"8509da95501e3b0996fe6a2beb618c3d009779f6c1a69d3a2f2d90d7fd8d0cf2"} Sep 30 15:08:47 crc kubenswrapper[4763]: I0930 15:08:47.091212 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-djj6s" podStartSLOduration=2.430861118 podStartE2EDuration="5.091189221s" podCreationTimestamp="2025-09-30 15:08:42 +0000 UTC" firstStartedPulling="2025-09-30 15:08:44.041093369 +0000 UTC m=+5596.179653664" lastFinishedPulling="2025-09-30 15:08:46.701421482 +0000 UTC m=+5598.839981767" observedRunningTime="2025-09-30 15:08:47.088583956 +0000 UTC m=+5599.227144251" watchObservedRunningTime="2025-09-30 15:08:47.091189221 +0000 UTC m=+5599.229749506" Sep 30 15:08:48 crc kubenswrapper[4763]: I0930 15:08:48.078496 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bndxz" podUID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" containerName="registry-server" containerID="cri-o://264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed" gracePeriod=2 Sep 30 15:08:48 crc kubenswrapper[4763]: I0930 15:08:48.481626 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:48 crc kubenswrapper[4763]: I0930 15:08:48.577862 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-catalog-content\") pod \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " Sep 30 15:08:48 crc kubenswrapper[4763]: I0930 15:08:48.577960 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8ldf\" (UniqueName: \"kubernetes.io/projected/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-kube-api-access-r8ldf\") pod \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " Sep 30 15:08:48 crc kubenswrapper[4763]: I0930 15:08:48.578000 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-utilities\") pod \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\" (UID: \"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c\") " Sep 30 15:08:48 crc kubenswrapper[4763]: I0930 15:08:48.578955 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-utilities" (OuterVolumeSpecName: "utilities") pod "89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" (UID: "89bc7891-fe56-4c82-9c4e-5baf3ac7b13c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:08:48 crc kubenswrapper[4763]: I0930 15:08:48.584336 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-kube-api-access-r8ldf" (OuterVolumeSpecName: "kube-api-access-r8ldf") pod "89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" (UID: "89bc7891-fe56-4c82-9c4e-5baf3ac7b13c"). InnerVolumeSpecName "kube-api-access-r8ldf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:08:48 crc kubenswrapper[4763]: I0930 15:08:48.635112 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" (UID: "89bc7891-fe56-4c82-9c4e-5baf3ac7b13c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:08:48 crc kubenswrapper[4763]: I0930 15:08:48.680245 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:48 crc kubenswrapper[4763]: I0930 15:08:48.680284 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8ldf\" (UniqueName: \"kubernetes.io/projected/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-kube-api-access-r8ldf\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:48 crc kubenswrapper[4763]: I0930 15:08:48.680314 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.089590 4763 generic.go:334] "Generic (PLEG): container finished" podID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" containerID="264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed" exitCode=0 Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.089664 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bndxz" event={"ID":"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c","Type":"ContainerDied","Data":"264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed"} Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.089664 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bndxz" Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.089709 4763 scope.go:117] "RemoveContainer" containerID="264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed" Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.089694 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bndxz" event={"ID":"89bc7891-fe56-4c82-9c4e-5baf3ac7b13c","Type":"ContainerDied","Data":"aa2bd62b8a5bec840698c61ad9d4721879a26126627c03cc2f2c8304ac163620"} Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.118834 4763 scope.go:117] "RemoveContainer" containerID="dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85" Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.142985 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bndxz"] Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.143918 4763 scope.go:117] "RemoveContainer" containerID="cbbdcf014cb1cfb512b15cc3f0b27d73ea5ddce9da0a4899ae31f944a867abae" Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.157892 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bndxz"] Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.192315 4763 scope.go:117] "RemoveContainer" containerID="264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed" Sep 30 15:08:49 crc kubenswrapper[4763]: E0930 15:08:49.193620 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed\": container with ID starting with 264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed not found: ID does not exist" containerID="264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed" Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.193669 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed"} err="failed to get container status \"264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed\": rpc error: code = NotFound desc = could not find container \"264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed\": container with ID starting with 264891bc5cc3628a6a58541d292309571076bf292f1fff45b692ee26df4ae8ed not found: ID does not exist" Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.193700 4763 scope.go:117] "RemoveContainer" containerID="dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85" Sep 30 15:08:49 crc kubenswrapper[4763]: E0930 15:08:49.196256 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85\": container with ID starting with dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85 not found: ID does not exist" containerID="dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85" Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.196416 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85"} err="failed to get container status \"dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85\": rpc error: code = NotFound desc = could not find 
container \"dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85\": container with ID starting with dae15ce2d43d756d91d8b1a86e24045ed81d66a47a7ad1ccc1b132ec0fc6ca85 not found: ID does not exist" Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.196566 4763 scope.go:117] "RemoveContainer" containerID="cbbdcf014cb1cfb512b15cc3f0b27d73ea5ddce9da0a4899ae31f944a867abae" Sep 30 15:08:49 crc kubenswrapper[4763]: E0930 15:08:49.197022 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbbdcf014cb1cfb512b15cc3f0b27d73ea5ddce9da0a4899ae31f944a867abae\": container with ID starting with cbbdcf014cb1cfb512b15cc3f0b27d73ea5ddce9da0a4899ae31f944a867abae not found: ID does not exist" containerID="cbbdcf014cb1cfb512b15cc3f0b27d73ea5ddce9da0a4899ae31f944a867abae" Sep 30 15:08:49 crc kubenswrapper[4763]: I0930 15:08:49.197130 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbbdcf014cb1cfb512b15cc3f0b27d73ea5ddce9da0a4899ae31f944a867abae"} err="failed to get container status \"cbbdcf014cb1cfb512b15cc3f0b27d73ea5ddce9da0a4899ae31f944a867abae\": rpc error: code = NotFound desc = could not find container \"cbbdcf014cb1cfb512b15cc3f0b27d73ea5ddce9da0a4899ae31f944a867abae\": container with ID starting with cbbdcf014cb1cfb512b15cc3f0b27d73ea5ddce9da0a4899ae31f944a867abae not found: ID does not exist" Sep 30 15:08:50 crc kubenswrapper[4763]: I0930 15:08:50.501693 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" path="/var/lib/kubelet/pods/89bc7891-fe56-4c82-9c4e-5baf3ac7b13c/volumes" Sep 30 15:08:51 crc kubenswrapper[4763]: I0930 15:08:51.195080 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-nzrdj_8f6f9f19-81cf-4593-8a84-7f1d771d4aa1/kube-rbac-proxy/0.log" Sep 30 15:08:51 crc kubenswrapper[4763]: I0930 15:08:51.368152 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-frr-files/0.log" Sep 30 15:08:51 crc kubenswrapper[4763]: I0930 15:08:51.570996 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-nzrdj_8f6f9f19-81cf-4593-8a84-7f1d771d4aa1/controller/0.log" Sep 30 15:08:51 crc kubenswrapper[4763]: I0930 15:08:51.606864 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-frr-files/0.log" Sep 30 15:08:51 crc kubenswrapper[4763]: I0930 15:08:51.620537 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-reloader/0.log" Sep 30 15:08:51 crc kubenswrapper[4763]: I0930 15:08:51.667501 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-metrics/0.log" Sep 30 15:08:51 crc kubenswrapper[4763]: I0930 15:08:51.754776 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-reloader/0.log" Sep 30 15:08:51 crc kubenswrapper[4763]: I0930 15:08:51.918296 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-frr-files/0.log" Sep 30 15:08:51 crc kubenswrapper[4763]: I0930 15:08:51.931262 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-metrics/0.log" Sep 30 15:08:51 crc kubenswrapper[4763]: I0930 15:08:51.937961 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-metrics/0.log" Sep 30 15:08:51 crc kubenswrapper[4763]: I0930 15:08:51.945913 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-reloader/0.log" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.152501 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-frr-files/0.log" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.156897 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-metrics/0.log" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.203975 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/controller/0.log" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.228504 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/cp-reloader/0.log" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.398420 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/kube-rbac-proxy/0.log" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.399154 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/frr-metrics/0.log" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.418273 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/kube-rbac-proxy-frr/0.log" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.647332 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-bd98m_9626782e-2d58-46e5-b064-7cc2fcb72381/frr-k8s-webhook-server/0.log" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.663305 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/reloader/0.log" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.844701 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-68dc9cffd9-8prvz_06af70e6-df4e-4f6b-a52a-0cc90fc0dfe6/manager/0.log" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.948832 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:52 crc kubenswrapper[4763]: I0930 15:08:52.948875 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:53 crc kubenswrapper[4763]: I0930 15:08:53.000277 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:53 crc kubenswrapper[4763]: I0930 15:08:53.081316 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-94474c8f7-zxvpq_7a15a621-d993-4cee-a58d-dbf5b4361ede/webhook-server/0.log" Sep 30 
15:08:53 crc kubenswrapper[4763]: I0930 15:08:53.167020 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:53 crc kubenswrapper[4763]: I0930 15:08:53.186403 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-c6gdx_f7308df6-1fa3-4459-a108-5151e3b927fd/kube-rbac-proxy/0.log" Sep 30 15:08:53 crc kubenswrapper[4763]: I0930 15:08:53.403813 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-djj6s"] Sep 30 15:08:53 crc kubenswrapper[4763]: I0930 15:08:53.858827 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-c6gdx_f7308df6-1fa3-4459-a108-5151e3b927fd/speaker/0.log" Sep 30 15:08:54 crc kubenswrapper[4763]: I0930 15:08:54.109880 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8jngk_d761c91a-ac32-4853-9045-0a8fb9df18c6/frr/0.log" Sep 30 15:08:55 crc kubenswrapper[4763]: I0930 15:08:55.134620 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-djj6s" podUID="efb83c05-8409-40c0-922b-0453b881bd65" containerName="registry-server" containerID="cri-o://8509da95501e3b0996fe6a2beb618c3d009779f6c1a69d3a2f2d90d7fd8d0cf2" gracePeriod=2 Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.147014 4763 generic.go:334] "Generic (PLEG): container finished" podID="efb83c05-8409-40c0-922b-0453b881bd65" containerID="8509da95501e3b0996fe6a2beb618c3d009779f6c1a69d3a2f2d90d7fd8d0cf2" exitCode=0 Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.147425 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djj6s" event={"ID":"efb83c05-8409-40c0-922b-0453b881bd65","Type":"ContainerDied","Data":"8509da95501e3b0996fe6a2beb618c3d009779f6c1a69d3a2f2d90d7fd8d0cf2"} Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.283991 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.412330 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-catalog-content\") pod \"efb83c05-8409-40c0-922b-0453b881bd65\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.412445 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jf2k\" (UniqueName: \"kubernetes.io/projected/efb83c05-8409-40c0-922b-0453b881bd65-kube-api-access-9jf2k\") pod \"efb83c05-8409-40c0-922b-0453b881bd65\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.413844 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-utilities\") pod \"efb83c05-8409-40c0-922b-0453b881bd65\" (UID: \"efb83c05-8409-40c0-922b-0453b881bd65\") " Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.414682 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-utilities" (OuterVolumeSpecName: "utilities") pod "efb83c05-8409-40c0-922b-0453b881bd65" (UID: "efb83c05-8409-40c0-922b-0453b881bd65"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.420712 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb83c05-8409-40c0-922b-0453b881bd65-kube-api-access-9jf2k" (OuterVolumeSpecName: "kube-api-access-9jf2k") pod "efb83c05-8409-40c0-922b-0453b881bd65" (UID: "efb83c05-8409-40c0-922b-0453b881bd65"). InnerVolumeSpecName "kube-api-access-9jf2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.467000 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efb83c05-8409-40c0-922b-0453b881bd65" (UID: "efb83c05-8409-40c0-922b-0453b881bd65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.516285 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.516349 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb83c05-8409-40c0-922b-0453b881bd65-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:56 crc kubenswrapper[4763]: I0930 15:08:56.516363 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jf2k\" (UniqueName: \"kubernetes.io/projected/efb83c05-8409-40c0-922b-0453b881bd65-kube-api-access-9jf2k\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:57 crc kubenswrapper[4763]: I0930 15:08:57.161563 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djj6s" event={"ID":"efb83c05-8409-40c0-922b-0453b881bd65","Type":"ContainerDied","Data":"28dfb0c0b71f499827dfd4291cf3cb5a4bb822063c46550195dfa6aced31c984"} Sep 30 15:08:57 crc kubenswrapper[4763]: I0930 15:08:57.161641 4763 scope.go:117] "RemoveContainer" containerID="8509da95501e3b0996fe6a2beb618c3d009779f6c1a69d3a2f2d90d7fd8d0cf2" Sep 30 15:08:57 crc kubenswrapper[4763]: I0930 15:08:57.161726 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-djj6s" Sep 30 15:08:57 crc kubenswrapper[4763]: I0930 15:08:57.185705 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-djj6s"] Sep 30 15:08:57 crc kubenswrapper[4763]: I0930 15:08:57.191864 4763 scope.go:117] "RemoveContainer" containerID="c7f2beec924e4f99cc1b8dafb99ec370d6d16320f80cc27f87086e47102791a9" Sep 30 15:08:57 crc kubenswrapper[4763]: I0930 15:08:57.204268 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-djj6s"] Sep 30 15:08:57 crc kubenswrapper[4763]: I0930 15:08:57.224909 4763 scope.go:117] "RemoveContainer" containerID="1e2b460990e5cf728da9f204793be69045ab8bdc08b04998cde5123a306b18f1" Sep 30 15:08:58 crc kubenswrapper[4763]: I0930 15:08:58.502533 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb83c05-8409-40c0-922b-0453b881bd65" path="/var/lib/kubelet/pods/efb83c05-8409-40c0-922b-0453b881bd65/volumes" Sep 30 15:09:04 crc kubenswrapper[4763]: I0930 15:09:04.488166 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h_42bb402e-356c-4746-ad86-991264de21e7/util/0.log" Sep 30 15:09:04 crc kubenswrapper[4763]: I0930 15:09:04.674623 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h_42bb402e-356c-4746-ad86-991264de21e7/util/0.log" Sep 30 15:09:04 crc kubenswrapper[4763]: I0930 15:09:04.735118 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h_42bb402e-356c-4746-ad86-991264de21e7/pull/0.log" Sep 30 15:09:04 crc kubenswrapper[4763]: I0930 15:09:04.744429 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h_42bb402e-356c-4746-ad86-991264de21e7/pull/0.log" Sep 30 15:09:04 crc kubenswrapper[4763]: I0930 15:09:04.878280 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h_42bb402e-356c-4746-ad86-991264de21e7/util/0.log" Sep 30 15:09:04 crc kubenswrapper[4763]: I0930 15:09:04.884644 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h_42bb402e-356c-4746-ad86-991264de21e7/pull/0.log" Sep 30 15:09:04 crc kubenswrapper[4763]: I0930 15:09:04.914314 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb6997k9h_42bb402e-356c-4746-ad86-991264de21e7/extract/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.044645 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87_bf82c0dd-1274-44d1-ac55-d1e2278de472/util/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.184669 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87_bf82c0dd-1274-44d1-ac55-d1e2278de472/pull/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.192653 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87_bf82c0dd-1274-44d1-ac55-d1e2278de472/util/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.202757 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87_bf82c0dd-1274-44d1-ac55-d1e2278de472/pull/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.342940 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87_bf82c0dd-1274-44d1-ac55-d1e2278de472/util/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.393223 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87_bf82c0dd-1274-44d1-ac55-d1e2278de472/pull/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.408928 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc59m87_bf82c0dd-1274-44d1-ac55-d1e2278de472/extract/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.538119 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nfg4_a8295f8d-50ee-49f5-890a-77e5bb976ce4/extract-utilities/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.689433 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nfg4_a8295f8d-50ee-49f5-890a-77e5bb976ce4/extract-content/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.724251 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nfg4_a8295f8d-50ee-49f5-890a-77e5bb976ce4/extract-content/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.730531 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nfg4_a8295f8d-50ee-49f5-890a-77e5bb976ce4/extract-utilities/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.855698 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nfg4_a8295f8d-50ee-49f5-890a-77e5bb976ce4/extract-utilities/0.log" Sep 30 15:09:05 crc kubenswrapper[4763]: I0930 15:09:05.873040 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nfg4_a8295f8d-50ee-49f5-890a-77e5bb976ce4/extract-content/0.log" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.059263 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.059319 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.059373 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-49jns" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.060117 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.060188 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" gracePeriod=600 Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.139326 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d987l_848ce3c4-a1ee-48f6-a387-341571572384/extract-utilities/0.log" Sep 30 15:09:06 crc kubenswrapper[4763]: E0930 15:09:06.193029 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.230985 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" exitCode=0 Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.231029 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44"} Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.231067 4763 scope.go:117] "RemoveContainer" containerID="250f87b86eee2fc155a67416ec0df43385031426a4b801bb824303cda0afd8fa" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.231663 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:09:06 crc kubenswrapper[4763]: E0930 15:09:06.231866 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.286315 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d987l_848ce3c4-a1ee-48f6-a387-341571572384/extract-content/0.log" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.308643 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d987l_848ce3c4-a1ee-48f6-a387-341571572384/extract-utilities/0.log" Sep 30 15:09:06 
crc kubenswrapper[4763]: I0930 15:09:06.386056 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d987l_848ce3c4-a1ee-48f6-a387-341571572384/extract-content/0.log" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.579502 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d987l_848ce3c4-a1ee-48f6-a387-341571572384/extract-content/0.log" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.596762 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d987l_848ce3c4-a1ee-48f6-a387-341571572384/extract-utilities/0.log" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.823714 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nfg4_a8295f8d-50ee-49f5-890a-77e5bb976ce4/registry-server/0.log" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.890222 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h_9410b485-e90e-4cb8-a924-d82596f1efee/util/0.log" Sep 30 15:09:06 crc kubenswrapper[4763]: I0930 15:09:06.921080 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d987l_848ce3c4-a1ee-48f6-a387-341571572384/registry-server/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.050955 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h_9410b485-e90e-4cb8-a924-d82596f1efee/pull/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.077323 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h_9410b485-e90e-4cb8-a924-d82596f1efee/util/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.083222 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h_9410b485-e90e-4cb8-a924-d82596f1efee/pull/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.393906 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h_9410b485-e90e-4cb8-a924-d82596f1efee/pull/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.418647 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h_9410b485-e90e-4cb8-a924-d82596f1efee/util/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.425516 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96njs6h_9410b485-e90e-4cb8-a924-d82596f1efee/extract/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.576402 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-58tml_50a4b247-74a5-4ceb-a32c-c92fce4f11b2/marketplace-operator/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.622478 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9j8qz_6bab0204-e2f4-4666-a525-ce8b8cea5f17/extract-utilities/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.804295 4763 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9j8qz_6bab0204-e2f4-4666-a525-ce8b8cea5f17/extract-content/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.805283 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9j8qz_6bab0204-e2f4-4666-a525-ce8b8cea5f17/extract-utilities/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.806821 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9j8qz_6bab0204-e2f4-4666-a525-ce8b8cea5f17/extract-content/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.963501 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9j8qz_6bab0204-e2f4-4666-a525-ce8b8cea5f17/extract-content/0.log" Sep 30 15:09:07 crc kubenswrapper[4763]: I0930 15:09:07.971963 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9j8qz_6bab0204-e2f4-4666-a525-ce8b8cea5f17/extract-utilities/0.log" Sep 30 15:09:08 crc kubenswrapper[4763]: I0930 15:09:08.054128 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mp4wr_b620c7c8-bb53-44f3-8723-828bf69bb55e/extract-utilities/0.log" Sep 30 15:09:08 crc kubenswrapper[4763]: I0930 15:09:08.168046 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9j8qz_6bab0204-e2f4-4666-a525-ce8b8cea5f17/registry-server/0.log" Sep 30 15:09:08 crc kubenswrapper[4763]: I0930 15:09:08.230152 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mp4wr_b620c7c8-bb53-44f3-8723-828bf69bb55e/extract-utilities/0.log" Sep 30 15:09:08 crc kubenswrapper[4763]: I0930 15:09:08.288548 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mp4wr_b620c7c8-bb53-44f3-8723-828bf69bb55e/extract-content/0.log" Sep 30 15:09:08 crc kubenswrapper[4763]: I0930 15:09:08.312073 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mp4wr_b620c7c8-bb53-44f3-8723-828bf69bb55e/extract-content/0.log" Sep 30 15:09:08 crc kubenswrapper[4763]: I0930 15:09:08.435618 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mp4wr_b620c7c8-bb53-44f3-8723-828bf69bb55e/extract-content/0.log" Sep 30 15:09:08 crc kubenswrapper[4763]: I0930 15:09:08.436023 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mp4wr_b620c7c8-bb53-44f3-8723-828bf69bb55e/extract-utilities/0.log" Sep 30 15:09:08 crc kubenswrapper[4763]: I0930 15:09:08.842652 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mp4wr_b620c7c8-bb53-44f3-8723-828bf69bb55e/registry-server/0.log" Sep 30 15:09:19 crc kubenswrapper[4763]: I0930 15:09:19.489034 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:09:19 crc kubenswrapper[4763]: E0930 15:09:19.489654 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:09:31 crc kubenswrapper[4763]: I0930 15:09:31.489904 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:09:31 crc kubenswrapper[4763]: E0930 15:09:31.490619 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:09:42 crc kubenswrapper[4763]: I0930 15:09:42.489513 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:09:42 crc kubenswrapper[4763]: E0930 15:09:42.490302 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:09:53 crc kubenswrapper[4763]: I0930 15:09:53.489328 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:09:53 crc kubenswrapper[4763]: E0930 15:09:53.491756 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:10:04 crc kubenswrapper[4763]: I0930 15:10:04.489807 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:10:04 crc kubenswrapper[4763]: E0930 15:10:04.490512 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:10:15 crc kubenswrapper[4763]: I0930 15:10:15.489763 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:10:15 crc kubenswrapper[4763]: E0930 15:10:15.490477 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:10:28 crc kubenswrapper[4763]: I0930 15:10:28.496382 4763 
scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:10:28 crc kubenswrapper[4763]: E0930 15:10:28.498809 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:10:42 crc kubenswrapper[4763]: I0930 15:10:42.489407 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:10:42 crc kubenswrapper[4763]: E0930 15:10:42.490175 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:10:57 crc kubenswrapper[4763]: I0930 15:10:57.489813 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:10:57 crc kubenswrapper[4763]: E0930 15:10:57.490669 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:11:12 crc kubenswrapper[4763]: I0930 15:11:12.489801 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:11:12 crc kubenswrapper[4763]: E0930 15:11:12.490643 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:11:27 crc kubenswrapper[4763]: I0930 15:11:27.489751 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:11:27 crc kubenswrapper[4763]: E0930 15:11:27.490453 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:11:41 crc kubenswrapper[4763]: I0930 15:11:41.489368 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:11:41 crc kubenswrapper[4763]: E0930 15:11:41.490286 4763 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:11:52 crc kubenswrapper[4763]: I0930 15:11:52.489869 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:11:52 crc kubenswrapper[4763]: E0930 15:11:52.490727 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:12:04 crc kubenswrapper[4763]: I0930 15:12:04.489749 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:12:04 crc kubenswrapper[4763]: E0930 15:12:04.492434 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:12:11 crc kubenswrapper[4763]: I0930 15:12:11.078658 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-hlsjx"] Sep 30 15:12:11 crc kubenswrapper[4763]: I0930 15:12:11.082458 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-hlsjx"] Sep 30 15:12:12 crc kubenswrapper[4763]: I0930 15:12:12.500392 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be97e602-d2d5-4dde-b3fc-3f013fabf8fc" path="/var/lib/kubelet/pods/be97e602-d2d5-4dde-b3fc-3f013fabf8fc/volumes" Sep 30 15:12:18 crc kubenswrapper[4763]: I0930 15:12:18.503730 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:12:18 crc kubenswrapper[4763]: E0930 15:12:18.504662 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:12:23 crc kubenswrapper[4763]: I0930 15:12:23.044637 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-33f9-account-create-x9dxr"] Sep 30 15:12:23 crc kubenswrapper[4763]: I0930 15:12:23.060431 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-33f9-account-create-x9dxr"] Sep 30 15:12:24 crc kubenswrapper[4763]: I0930 15:12:24.510528 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6579aae-8365-4154-bc8b-c5af7b342ebb" 
path="/var/lib/kubelet/pods/e6579aae-8365-4154-bc8b-c5af7b342ebb/volumes" Sep 30 15:12:29 crc kubenswrapper[4763]: I0930 15:12:29.068550 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wqkxw"] Sep 30 15:12:29 crc kubenswrapper[4763]: I0930 15:12:29.075244 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wqkxw"] Sep 30 15:12:30 crc kubenswrapper[4763]: I0930 15:12:30.509052 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae0d63a0-c59a-4eb2-ae24-b65200d012b8" path="/var/lib/kubelet/pods/ae0d63a0-c59a-4eb2-ae24-b65200d012b8/volumes" Sep 30 15:12:33 crc kubenswrapper[4763]: I0930 15:12:33.490224 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:12:33 crc kubenswrapper[4763]: E0930 15:12:33.492704 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:12:34 crc kubenswrapper[4763]: I0930 15:12:34.977518 4763 scope.go:117] "RemoveContainer" containerID="2037c6e333a957fe66075dc6c0a96844cde6ca1b7eafd9b094ed58e953b44cbb" Sep 30 15:12:35 crc kubenswrapper[4763]: I0930 15:12:35.014214 4763 scope.go:117] "RemoveContainer" containerID="13781fa7eecf5e7586de82f2051837927da1994ea3e89978d33596bd9b26e154" Sep 30 15:12:35 crc kubenswrapper[4763]: I0930 15:12:35.061251 4763 scope.go:117] "RemoveContainer" containerID="85ea2a498dd708c3391dcb82b606a930090b4ad77ed41e25c536d13f0d2637b1" Sep 30 15:12:42 crc kubenswrapper[4763]: I0930 15:12:42.045944 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-txslw"] Sep 30 15:12:42 crc kubenswrapper[4763]: I0930 15:12:42.052895 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-txslw"] Sep 30 15:12:42 crc kubenswrapper[4763]: I0930 15:12:42.499502 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b793587a-a139-454e-9837-a388a88f9129" path="/var/lib/kubelet/pods/b793587a-a139-454e-9837-a388a88f9129/volumes" Sep 30 15:12:48 crc kubenswrapper[4763]: I0930 15:12:48.501388 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:12:48 crc kubenswrapper[4763]: E0930 15:12:48.510758 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:13:02 crc kubenswrapper[4763]: I0930 15:13:02.489153 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:13:02 crc kubenswrapper[4763]: E0930 15:13:02.490108 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:13:17 crc kubenswrapper[4763]: I0930 15:13:17.489643 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:13:17 crc kubenswrapper[4763]: E0930 15:13:17.491191 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:13:30 crc kubenswrapper[4763]: I0930 15:13:30.489692 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:13:30 crc kubenswrapper[4763]: E0930 15:13:30.490343 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:13:35 crc kubenswrapper[4763]: I0930 15:13:35.151953 4763 scope.go:117] "RemoveContainer" containerID="9a4b2e6b3eacaed3f798846d7d67802d3c12ac219becc86f971e03e25c2a2d8e" Sep 30 15:13:43 crc kubenswrapper[4763]: I0930 15:13:43.489327 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:13:43 crc kubenswrapper[4763]: E0930 15:13:43.490055 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:13:55 crc kubenswrapper[4763]: I0930 15:13:55.489936 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:13:55 crc kubenswrapper[4763]: E0930 15:13:55.491394 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-49jns_openshift-machine-config-operator(e3789557-abc5-4243-9049-4afe8717cdf9)\"" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" Sep 30 15:14:10 crc kubenswrapper[4763]: I0930 15:14:10.490110 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44" Sep 30 15:14:10 crc kubenswrapper[4763]: I0930 15:14:10.895546 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" 
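
The run above is kubelet's CrashLoopBackOff throttling: each periodic sync of machine-config-daemon-49jns first clears the dead container record ("RemoveContainer" aefbb978...) and is then rejected because the pod is still inside its back-off window, which has already grown to the 5m0s cap. The container is finally allowed to start again at 15:14:10 (ContainerStarted f293d7a1...). A minimal sketch of the doubling-with-cap delay, assuming kubelet's documented 10s initial delay, factor of 2, and 5m ceiling (the function name and shape are illustrative, not kubelet's actual API):

    package main

    import (
        "fmt"
        "time"
    )

    // nextBackoff models CrashLoopBackOff: the restart delay doubles per
    // crash until it saturates at 5m0s, the figure quoted in the
    // "back-off 5m0s restarting failed container" errors above.
    func nextBackoff(restarts int) time.Duration {
        const (
            initial  = 10 * time.Second
            maxDelay = 5 * time.Minute
        )
        d := initial << uint(restarts) // 10s, 20s, 40s, 1m20s, ...
        if d <= 0 || d > maxDelay {    // guard overflow as well as the cap
            return maxDelay
        }
        return d
    }

    func main() {
        for r := 0; r <= 6; r++ {
            fmt.Printf("restart %d -> wait %v\n", r, nextBackoff(r))
        }
        // From the 5th restart onward this prints 5m0s, matching the log.
    }

Note that the interleaved work is unaffected: the openstack/keystone-* job pods are deleted and their orphaned volume directories cleaned up on the normal sync cadence while the back-off gates only the one failing container.
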
event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"f293d7a11add6d24ec5417ff9c628d9e061fb881b0afc7494825f5b3a1b852db"} Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.453417 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-746j5"] Sep 30 15:14:14 crc kubenswrapper[4763]: E0930 15:14:14.454344 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb83c05-8409-40c0-922b-0453b881bd65" containerName="registry-server" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.454360 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb83c05-8409-40c0-922b-0453b881bd65" containerName="registry-server" Sep 30 15:14:14 crc kubenswrapper[4763]: E0930 15:14:14.454379 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" containerName="extract-utilities" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.454387 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" containerName="extract-utilities" Sep 30 15:14:14 crc kubenswrapper[4763]: E0930 15:14:14.454403 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" containerName="registry-server" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.454411 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" containerName="registry-server" Sep 30 15:14:14 crc kubenswrapper[4763]: E0930 15:14:14.454427 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" containerName="extract-content" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.454435 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" containerName="extract-content" Sep 30 15:14:14 crc kubenswrapper[4763]: E0930 15:14:14.454455 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb83c05-8409-40c0-922b-0453b881bd65" containerName="extract-content" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.454461 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb83c05-8409-40c0-922b-0453b881bd65" containerName="extract-content" Sep 30 15:14:14 crc kubenswrapper[4763]: E0930 15:14:14.454474 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb83c05-8409-40c0-922b-0453b881bd65" containerName="extract-utilities" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.454511 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb83c05-8409-40c0-922b-0453b881bd65" containerName="extract-utilities" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.454715 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb83c05-8409-40c0-922b-0453b881bd65" containerName="registry-server" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.454749 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="89bc7891-fe56-4c82-9c4e-5baf3ac7b13c" containerName="registry-server" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.457070 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.466541 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-746j5"] Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.539722 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc9sr\" (UniqueName: \"kubernetes.io/projected/5913b42e-330b-4a40-bb1b-3567aec71798-kube-api-access-gc9sr\") pod \"redhat-operators-746j5\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.539795 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-catalog-content\") pod \"redhat-operators-746j5\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.540120 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-utilities\") pod \"redhat-operators-746j5\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.641221 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc9sr\" (UniqueName: \"kubernetes.io/projected/5913b42e-330b-4a40-bb1b-3567aec71798-kube-api-access-gc9sr\") pod \"redhat-operators-746j5\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.641287 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-catalog-content\") pod \"redhat-operators-746j5\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.641387 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-utilities\") pod \"redhat-operators-746j5\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.641822 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-utilities\") pod \"redhat-operators-746j5\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.641913 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-catalog-content\") pod \"redhat-operators-746j5\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.669533 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gc9sr\" (UniqueName: \"kubernetes.io/projected/5913b42e-330b-4a40-bb1b-3567aec71798-kube-api-access-gc9sr\") pod \"redhat-operators-746j5\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:14 crc kubenswrapper[4763]: I0930 15:14:14.775106 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:15 crc kubenswrapper[4763]: I0930 15:14:15.318659 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-746j5"] Sep 30 15:14:15 crc kubenswrapper[4763]: I0930 15:14:15.974915 4763 generic.go:334] "Generic (PLEG): container finished" podID="5913b42e-330b-4a40-bb1b-3567aec71798" containerID="ffe8111438d7ccdd2e70318c6fd3a447166a52617fef1d70ea60e9c643a4314b" exitCode=0 Sep 30 15:14:15 crc kubenswrapper[4763]: I0930 15:14:15.975203 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746j5" event={"ID":"5913b42e-330b-4a40-bb1b-3567aec71798","Type":"ContainerDied","Data":"ffe8111438d7ccdd2e70318c6fd3a447166a52617fef1d70ea60e9c643a4314b"} Sep 30 15:14:15 crc kubenswrapper[4763]: I0930 15:14:15.975319 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746j5" event={"ID":"5913b42e-330b-4a40-bb1b-3567aec71798","Type":"ContainerStarted","Data":"7787afe35ecf605dcb518be8864db68dc2a397775766fde93270c42bb9c6c91c"} Sep 30 15:14:15 crc kubenswrapper[4763]: I0930 15:14:15.977668 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 15:14:16 crc kubenswrapper[4763]: I0930 15:14:16.985926 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746j5" event={"ID":"5913b42e-330b-4a40-bb1b-3567aec71798","Type":"ContainerStarted","Data":"43f40a23371e83e2b3e4f7de85a4ee291fad931bb153634e8f1b498bc28d9780"} Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.452678 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-96lk6"] Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.456022 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.471777 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-96lk6"] Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.597427 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njgjp\" (UniqueName: \"kubernetes.io/projected/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-kube-api-access-njgjp\") pod \"redhat-marketplace-96lk6\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.597562 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-utilities\") pod \"redhat-marketplace-96lk6\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.597634 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-catalog-content\") pod \"redhat-marketplace-96lk6\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.699581 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-utilities\") pod \"redhat-marketplace-96lk6\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.699689 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-catalog-content\") pod \"redhat-marketplace-96lk6\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.699751 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njgjp\" (UniqueName: \"kubernetes.io/projected/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-kube-api-access-njgjp\") pod \"redhat-marketplace-96lk6\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.700178 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-utilities\") pod \"redhat-marketplace-96lk6\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.700189 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-catalog-content\") pod \"redhat-marketplace-96lk6\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.725273 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-njgjp\" (UniqueName: \"kubernetes.io/projected/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-kube-api-access-njgjp\") pod \"redhat-marketplace-96lk6\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.787840 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.996337 4763 generic.go:334] "Generic (PLEG): container finished" podID="5913b42e-330b-4a40-bb1b-3567aec71798" containerID="43f40a23371e83e2b3e4f7de85a4ee291fad931bb153634e8f1b498bc28d9780" exitCode=0 Sep 30 15:14:17 crc kubenswrapper[4763]: I0930 15:14:17.996406 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746j5" event={"ID":"5913b42e-330b-4a40-bb1b-3567aec71798","Type":"ContainerDied","Data":"43f40a23371e83e2b3e4f7de85a4ee291fad931bb153634e8f1b498bc28d9780"} Sep 30 15:14:18 crc kubenswrapper[4763]: I0930 15:14:18.262030 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-96lk6"] Sep 30 15:14:19 crc kubenswrapper[4763]: I0930 15:14:19.008129 4763 generic.go:334] "Generic (PLEG): container finished" podID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" containerID="e078efb7670544ab6ad4a558c60d6bb840c7ce084f18f490f983759052c70213" exitCode=0 Sep 30 15:14:19 crc kubenswrapper[4763]: I0930 15:14:19.008541 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96lk6" event={"ID":"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23","Type":"ContainerDied","Data":"e078efb7670544ab6ad4a558c60d6bb840c7ce084f18f490f983759052c70213"} Sep 30 15:14:19 crc kubenswrapper[4763]: I0930 15:14:19.008642 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96lk6" event={"ID":"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23","Type":"ContainerStarted","Data":"a9bbad6407f759ea4115a6de0ce9d3ef0067b329510e109f97b357894404ae08"} Sep 30 15:14:19 crc kubenswrapper[4763]: I0930 15:14:19.018957 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746j5" event={"ID":"5913b42e-330b-4a40-bb1b-3567aec71798","Type":"ContainerStarted","Data":"e9e47cafa5f65bf3f6381404d0b4d49853d5e3e65db0dd6b4ef055716d7c4533"} Sep 30 15:14:19 crc kubenswrapper[4763]: I0930 15:14:19.059679 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-746j5" podStartSLOduration=2.646124294 podStartE2EDuration="5.059657764s" podCreationTimestamp="2025-09-30 15:14:14 +0000 UTC" firstStartedPulling="2025-09-30 15:14:15.977403498 +0000 UTC m=+5928.115963783" lastFinishedPulling="2025-09-30 15:14:18.390936968 +0000 UTC m=+5930.529497253" observedRunningTime="2025-09-30 15:14:19.049045349 +0000 UTC m=+5931.187605674" watchObservedRunningTime="2025-09-30 15:14:19.059657764 +0000 UTC m=+5931.198218049" Sep 30 15:14:20 crc kubenswrapper[4763]: I0930 15:14:20.029308 4763 generic.go:334] "Generic (PLEG): container finished" podID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" containerID="0614a829517b85283728ff8770aca095f2155a0ae80e65d3391b8b078f728654" exitCode=0 Sep 30 15:14:20 crc kubenswrapper[4763]: I0930 15:14:20.029375 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96lk6" 
event={"ID":"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23","Type":"ContainerDied","Data":"0614a829517b85283728ff8770aca095f2155a0ae80e65d3391b8b078f728654"} Sep 30 15:14:21 crc kubenswrapper[4763]: I0930 15:14:21.039877 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96lk6" event={"ID":"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23","Type":"ContainerStarted","Data":"54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4"} Sep 30 15:14:21 crc kubenswrapper[4763]: I0930 15:14:21.062012 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-96lk6" podStartSLOduration=2.647666175 podStartE2EDuration="4.061984194s" podCreationTimestamp="2025-09-30 15:14:17 +0000 UTC" firstStartedPulling="2025-09-30 15:14:19.012809204 +0000 UTC m=+5931.151369489" lastFinishedPulling="2025-09-30 15:14:20.427127213 +0000 UTC m=+5932.565687508" observedRunningTime="2025-09-30 15:14:21.05782485 +0000 UTC m=+5933.196385165" watchObservedRunningTime="2025-09-30 15:14:21.061984194 +0000 UTC m=+5933.200544499" Sep 30 15:14:24 crc kubenswrapper[4763]: I0930 15:14:24.775729 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:24 crc kubenswrapper[4763]: I0930 15:14:24.776132 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:24 crc kubenswrapper[4763]: I0930 15:14:24.860100 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:25 crc kubenswrapper[4763]: I0930 15:14:25.129097 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:25 crc kubenswrapper[4763]: I0930 15:14:25.244081 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-746j5"] Sep 30 15:14:27 crc kubenswrapper[4763]: I0930 15:14:27.094676 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-746j5" podUID="5913b42e-330b-4a40-bb1b-3567aec71798" containerName="registry-server" containerID="cri-o://e9e47cafa5f65bf3f6381404d0b4d49853d5e3e65db0dd6b4ef055716d7c4533" gracePeriod=2 Sep 30 15:14:27 crc kubenswrapper[4763]: I0930 15:14:27.788353 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:27 crc kubenswrapper[4763]: I0930 15:14:27.788790 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:27 crc kubenswrapper[4763]: I0930 15:14:27.846027 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:28 crc kubenswrapper[4763]: I0930 15:14:28.158559 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:28 crc kubenswrapper[4763]: I0930 15:14:28.642154 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-96lk6"] Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.117020 4763 generic.go:334] "Generic (PLEG): container finished" podID="5913b42e-330b-4a40-bb1b-3567aec71798" 
containerID="e9e47cafa5f65bf3f6381404d0b4d49853d5e3e65db0dd6b4ef055716d7c4533" exitCode=0 Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.117126 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746j5" event={"ID":"5913b42e-330b-4a40-bb1b-3567aec71798","Type":"ContainerDied","Data":"e9e47cafa5f65bf3f6381404d0b4d49853d5e3e65db0dd6b4ef055716d7c4533"} Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.414511 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.600703 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-catalog-content\") pod \"5913b42e-330b-4a40-bb1b-3567aec71798\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.600829 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-utilities\") pod \"5913b42e-330b-4a40-bb1b-3567aec71798\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.600875 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc9sr\" (UniqueName: \"kubernetes.io/projected/5913b42e-330b-4a40-bb1b-3567aec71798-kube-api-access-gc9sr\") pod \"5913b42e-330b-4a40-bb1b-3567aec71798\" (UID: \"5913b42e-330b-4a40-bb1b-3567aec71798\") " Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.601842 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-utilities" (OuterVolumeSpecName: "utilities") pod "5913b42e-330b-4a40-bb1b-3567aec71798" (UID: "5913b42e-330b-4a40-bb1b-3567aec71798"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.607934 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5913b42e-330b-4a40-bb1b-3567aec71798-kube-api-access-gc9sr" (OuterVolumeSpecName: "kube-api-access-gc9sr") pod "5913b42e-330b-4a40-bb1b-3567aec71798" (UID: "5913b42e-330b-4a40-bb1b-3567aec71798"). InnerVolumeSpecName "kube-api-access-gc9sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.702906 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.702946 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc9sr\" (UniqueName: \"kubernetes.io/projected/5913b42e-330b-4a40-bb1b-3567aec71798-kube-api-access-gc9sr\") on node \"crc\" DevicePath \"\"" Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.709242 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5913b42e-330b-4a40-bb1b-3567aec71798" (UID: "5913b42e-330b-4a40-bb1b-3567aec71798"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:14:29 crc kubenswrapper[4763]: I0930 15:14:29.804481 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5913b42e-330b-4a40-bb1b-3567aec71798-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.128750 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746j5" event={"ID":"5913b42e-330b-4a40-bb1b-3567aec71798","Type":"ContainerDied","Data":"7787afe35ecf605dcb518be8864db68dc2a397775766fde93270c42bb9c6c91c"} Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.128842 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-96lk6" podUID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" containerName="registry-server" containerID="cri-o://54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4" gracePeriod=2 Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.128860 4763 scope.go:117] "RemoveContainer" containerID="e9e47cafa5f65bf3f6381404d0b4d49853d5e3e65db0dd6b4ef055716d7c4533" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.128785 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-746j5" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.162249 4763 scope.go:117] "RemoveContainer" containerID="43f40a23371e83e2b3e4f7de85a4ee291fad931bb153634e8f1b498bc28d9780" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.175251 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-746j5"] Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.183711 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-746j5"] Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.255773 4763 scope.go:117] "RemoveContainer" containerID="ffe8111438d7ccdd2e70318c6fd3a447166a52617fef1d70ea60e9c643a4314b" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.498243 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5913b42e-330b-4a40-bb1b-3567aec71798" path="/var/lib/kubelet/pods/5913b42e-330b-4a40-bb1b-3567aec71798/volumes" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.589379 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.720573 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njgjp\" (UniqueName: \"kubernetes.io/projected/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-kube-api-access-njgjp\") pod \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.720666 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-catalog-content\") pod \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.720770 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-utilities\") pod \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\" (UID: \"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23\") " Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.721615 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-utilities" (OuterVolumeSpecName: "utilities") pod "2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" (UID: "2a5ae9c6-cab2-4385-a3e5-64bf395e5e23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.727408 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-kube-api-access-njgjp" (OuterVolumeSpecName: "kube-api-access-njgjp") pod "2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" (UID: "2a5ae9c6-cab2-4385-a3e5-64bf395e5e23"). InnerVolumeSpecName "kube-api-access-njgjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.733645 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" (UID: "2a5ae9c6-cab2-4385-a3e5-64bf395e5e23"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.823199 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.823249 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njgjp\" (UniqueName: \"kubernetes.io/projected/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-kube-api-access-njgjp\") on node \"crc\" DevicePath \"\"" Sep 30 15:14:30 crc kubenswrapper[4763]: I0930 15:14:30.823263 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.141111 4763 generic.go:334] "Generic (PLEG): container finished" podID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" containerID="54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4" exitCode=0 Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.141205 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96lk6" event={"ID":"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23","Type":"ContainerDied","Data":"54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4"} Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.141249 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96lk6" event={"ID":"2a5ae9c6-cab2-4385-a3e5-64bf395e5e23","Type":"ContainerDied","Data":"a9bbad6407f759ea4115a6de0ce9d3ef0067b329510e109f97b357894404ae08"} Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.141274 4763 scope.go:117] "RemoveContainer" containerID="54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4" Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.141438 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96lk6" Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.166454 4763 scope.go:117] "RemoveContainer" containerID="0614a829517b85283728ff8770aca095f2155a0ae80e65d3391b8b078f728654" Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.190053 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-96lk6"] Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.199382 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-96lk6"] Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.203335 4763 scope.go:117] "RemoveContainer" containerID="e078efb7670544ab6ad4a558c60d6bb840c7ce084f18f490f983759052c70213" Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.258338 4763 scope.go:117] "RemoveContainer" containerID="54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4" Sep 30 15:14:31 crc kubenswrapper[4763]: E0930 15:14:31.259097 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4\": container with ID starting with 54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4 not found: ID does not exist" containerID="54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4" Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.259303 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4"} err="failed to get container status \"54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4\": rpc error: code = NotFound desc = could not find container \"54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4\": container with ID starting with 54d9b8e83721dc738a0f2b9ea9f124703888022c3d8b3b3faecfb4baa60743b4 not found: ID does not exist" Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.259430 4763 scope.go:117] "RemoveContainer" containerID="0614a829517b85283728ff8770aca095f2155a0ae80e65d3391b8b078f728654" Sep 30 15:14:31 crc kubenswrapper[4763]: E0930 15:14:31.260245 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0614a829517b85283728ff8770aca095f2155a0ae80e65d3391b8b078f728654\": container with ID starting with 0614a829517b85283728ff8770aca095f2155a0ae80e65d3391b8b078f728654 not found: ID does not exist" containerID="0614a829517b85283728ff8770aca095f2155a0ae80e65d3391b8b078f728654" Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.260284 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0614a829517b85283728ff8770aca095f2155a0ae80e65d3391b8b078f728654"} err="failed to get container status \"0614a829517b85283728ff8770aca095f2155a0ae80e65d3391b8b078f728654\": rpc error: code = NotFound desc = could not find container \"0614a829517b85283728ff8770aca095f2155a0ae80e65d3391b8b078f728654\": container with ID starting with 0614a829517b85283728ff8770aca095f2155a0ae80e65d3391b8b078f728654 not found: ID does not exist" Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.260314 4763 scope.go:117] "RemoveContainer" containerID="e078efb7670544ab6ad4a558c60d6bb840c7ce084f18f490f983759052c70213" Sep 30 15:14:31 crc kubenswrapper[4763]: E0930 15:14:31.261129 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e078efb7670544ab6ad4a558c60d6bb840c7ce084f18f490f983759052c70213\": container with ID starting with e078efb7670544ab6ad4a558c60d6bb840c7ce084f18f490f983759052c70213 not found: ID does not exist" containerID="e078efb7670544ab6ad4a558c60d6bb840c7ce084f18f490f983759052c70213" Sep 30 15:14:31 crc kubenswrapper[4763]: I0930 15:14:31.261167 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e078efb7670544ab6ad4a558c60d6bb840c7ce084f18f490f983759052c70213"} err="failed to get container status \"e078efb7670544ab6ad4a558c60d6bb840c7ce084f18f490f983759052c70213\": rpc error: code = NotFound desc = could not find container \"e078efb7670544ab6ad4a558c60d6bb840c7ce084f18f490f983759052c70213\": container with ID starting with e078efb7670544ab6ad4a558c60d6bb840c7ce084f18f490f983759052c70213 not found: ID does not exist" Sep 30 15:14:32 crc kubenswrapper[4763]: I0930 15:14:32.507369 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" path="/var/lib/kubelet/pods/2a5ae9c6-cab2-4385-a3e5-64bf395e5e23/volumes" Sep 30 15:14:35 crc kubenswrapper[4763]: I0930 15:14:35.222533 4763 scope.go:117] "RemoveContainer" containerID="9935cda85c78352ff4f8a59f7a9f890350df51a5df7daed61ffe7ee6cc55287a" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.153274 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth"] Sep 30 15:15:00 crc kubenswrapper[4763]: E0930 15:15:00.154269 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5913b42e-330b-4a40-bb1b-3567aec71798" containerName="registry-server" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.154285 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5913b42e-330b-4a40-bb1b-3567aec71798" containerName="registry-server" Sep 30 15:15:00 crc kubenswrapper[4763]: E0930 15:15:00.154294 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" containerName="registry-server" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.154302 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" containerName="registry-server" Sep 30 15:15:00 crc kubenswrapper[4763]: E0930 15:15:00.154320 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" containerName="extract-utilities" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.154327 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" containerName="extract-utilities" Sep 30 15:15:00 crc kubenswrapper[4763]: E0930 15:15:00.154342 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5913b42e-330b-4a40-bb1b-3567aec71798" containerName="extract-utilities" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.154349 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5913b42e-330b-4a40-bb1b-3567aec71798" containerName="extract-utilities" Sep 30 15:15:00 crc kubenswrapper[4763]: E0930 15:15:00.154369 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5913b42e-330b-4a40-bb1b-3567aec71798" containerName="extract-content" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.154376 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5913b42e-330b-4a40-bb1b-3567aec71798" containerName="extract-content" 
Sep 30 15:15:00 crc kubenswrapper[4763]: E0930 15:15:00.154396 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" containerName="extract-content" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.154403 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" containerName="extract-content" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.154583 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a5ae9c6-cab2-4385-a3e5-64bf395e5e23" containerName="registry-server" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.154635 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5913b42e-330b-4a40-bb1b-3567aec71798" containerName="registry-server" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.155341 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.160111 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.160435 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.165896 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth"] Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.301949 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69dc8ef9-0a6a-40d0-8073-fa713489f186-secret-volume\") pod \"collect-profiles-29320755-lvcth\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.302287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69dc8ef9-0a6a-40d0-8073-fa713489f186-config-volume\") pod \"collect-profiles-29320755-lvcth\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.302518 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ck27\" (UniqueName: \"kubernetes.io/projected/69dc8ef9-0a6a-40d0-8073-fa713489f186-kube-api-access-2ck27\") pod \"collect-profiles-29320755-lvcth\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.406139 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69dc8ef9-0a6a-40d0-8073-fa713489f186-secret-volume\") pod \"collect-profiles-29320755-lvcth\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.406465 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/69dc8ef9-0a6a-40d0-8073-fa713489f186-config-volume\") pod \"collect-profiles-29320755-lvcth\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.406578 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ck27\" (UniqueName: \"kubernetes.io/projected/69dc8ef9-0a6a-40d0-8073-fa713489f186-kube-api-access-2ck27\") pod \"collect-profiles-29320755-lvcth\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.407519 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69dc8ef9-0a6a-40d0-8073-fa713489f186-config-volume\") pod \"collect-profiles-29320755-lvcth\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.426250 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69dc8ef9-0a6a-40d0-8073-fa713489f186-secret-volume\") pod \"collect-profiles-29320755-lvcth\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.430188 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ck27\" (UniqueName: \"kubernetes.io/projected/69dc8ef9-0a6a-40d0-8073-fa713489f186-kube-api-access-2ck27\") pod \"collect-profiles-29320755-lvcth\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.479038 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:00 crc kubenswrapper[4763]: I0930 15:15:00.903860 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth"] Sep 30 15:15:00 crc kubenswrapper[4763]: W0930 15:15:00.915100 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69dc8ef9_0a6a_40d0_8073_fa713489f186.slice/crio-e5aa6f06e3e34d61cb88dffb9f9283bacc61a92377f64d2be0bd9fde3c1ce4d8 WatchSource:0}: Error finding container e5aa6f06e3e34d61cb88dffb9f9283bacc61a92377f64d2be0bd9fde3c1ce4d8: Status 404 returned error can't find the container with id e5aa6f06e3e34d61cb88dffb9f9283bacc61a92377f64d2be0bd9fde3c1ce4d8 Sep 30 15:15:01 crc kubenswrapper[4763]: I0930 15:15:01.441029 4763 generic.go:334] "Generic (PLEG): container finished" podID="69dc8ef9-0a6a-40d0-8073-fa713489f186" containerID="b45fe8fd7686a0b1dc795cda1f2ffbd33e5fbac55b858e4758243fa3e431f4a0" exitCode=0 Sep 30 15:15:01 crc kubenswrapper[4763]: I0930 15:15:01.441089 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" event={"ID":"69dc8ef9-0a6a-40d0-8073-fa713489f186","Type":"ContainerDied","Data":"b45fe8fd7686a0b1dc795cda1f2ffbd33e5fbac55b858e4758243fa3e431f4a0"} Sep 30 15:15:01 crc kubenswrapper[4763]: I0930 15:15:01.441121 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" event={"ID":"69dc8ef9-0a6a-40d0-8073-fa713489f186","Type":"ContainerStarted","Data":"e5aa6f06e3e34d61cb88dffb9f9283bacc61a92377f64d2be0bd9fde3c1ce4d8"} Sep 30 15:15:02 crc kubenswrapper[4763]: I0930 15:15:02.771509 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:02 crc kubenswrapper[4763]: I0930 15:15:02.952217 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69dc8ef9-0a6a-40d0-8073-fa713489f186-secret-volume\") pod \"69dc8ef9-0a6a-40d0-8073-fa713489f186\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " Sep 30 15:15:02 crc kubenswrapper[4763]: I0930 15:15:02.952376 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69dc8ef9-0a6a-40d0-8073-fa713489f186-config-volume\") pod \"69dc8ef9-0a6a-40d0-8073-fa713489f186\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " Sep 30 15:15:02 crc kubenswrapper[4763]: I0930 15:15:02.952415 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ck27\" (UniqueName: \"kubernetes.io/projected/69dc8ef9-0a6a-40d0-8073-fa713489f186-kube-api-access-2ck27\") pod \"69dc8ef9-0a6a-40d0-8073-fa713489f186\" (UID: \"69dc8ef9-0a6a-40d0-8073-fa713489f186\") " Sep 30 15:15:02 crc kubenswrapper[4763]: I0930 15:15:02.953232 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69dc8ef9-0a6a-40d0-8073-fa713489f186-config-volume" (OuterVolumeSpecName: "config-volume") pod "69dc8ef9-0a6a-40d0-8073-fa713489f186" (UID: "69dc8ef9-0a6a-40d0-8073-fa713489f186"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 15:15:02 crc kubenswrapper[4763]: I0930 15:15:02.959855 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69dc8ef9-0a6a-40d0-8073-fa713489f186-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "69dc8ef9-0a6a-40d0-8073-fa713489f186" (UID: "69dc8ef9-0a6a-40d0-8073-fa713489f186"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:15:02 crc kubenswrapper[4763]: I0930 15:15:02.962125 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69dc8ef9-0a6a-40d0-8073-fa713489f186-kube-api-access-2ck27" (OuterVolumeSpecName: "kube-api-access-2ck27") pod "69dc8ef9-0a6a-40d0-8073-fa713489f186" (UID: "69dc8ef9-0a6a-40d0-8073-fa713489f186"). InnerVolumeSpecName "kube-api-access-2ck27". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:15:03 crc kubenswrapper[4763]: I0930 15:15:03.055050 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69dc8ef9-0a6a-40d0-8073-fa713489f186-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 15:15:03 crc kubenswrapper[4763]: I0930 15:15:03.055107 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69dc8ef9-0a6a-40d0-8073-fa713489f186-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 15:15:03 crc kubenswrapper[4763]: I0930 15:15:03.055118 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ck27\" (UniqueName: \"kubernetes.io/projected/69dc8ef9-0a6a-40d0-8073-fa713489f186-kube-api-access-2ck27\") on node \"crc\" DevicePath \"\"" Sep 30 15:15:03 crc kubenswrapper[4763]: I0930 15:15:03.462395 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" event={"ID":"69dc8ef9-0a6a-40d0-8073-fa713489f186","Type":"ContainerDied","Data":"e5aa6f06e3e34d61cb88dffb9f9283bacc61a92377f64d2be0bd9fde3c1ce4d8"} Sep 30 15:15:03 crc kubenswrapper[4763]: I0930 15:15:03.462442 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5aa6f06e3e34d61cb88dffb9f9283bacc61a92377f64d2be0bd9fde3c1ce4d8" Sep 30 15:15:03 crc kubenswrapper[4763]: I0930 15:15:03.462500 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-lvcth" Sep 30 15:15:03 crc kubenswrapper[4763]: I0930 15:15:03.855513 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc"] Sep 30 15:15:03 crc kubenswrapper[4763]: I0930 15:15:03.863816 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320710-5zlpc"] Sep 30 15:15:04 crc kubenswrapper[4763]: I0930 15:15:04.519154 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74fac9a6-d13e-48f9-a502-2026c9d71525" path="/var/lib/kubelet/pods/74fac9a6-d13e-48f9-a502-2026c9d71525/volumes" Sep 30 15:15:35 crc kubenswrapper[4763]: I0930 15:15:35.323305 4763 scope.go:117] "RemoveContainer" containerID="53315db663bab298ee6ab16fa5c5ec759a8e31781865a029f942c8cf381b077b" Sep 30 15:15:48 crc kubenswrapper[4763]: I0930 15:15:48.873566 4763 generic.go:334] "Generic (PLEG): container finished" podID="00acb94e-4228-4d1d-9f74-1856acbc9d71" containerID="fd447ad5c6030202550f7eaef684f6dd7b437341fee2afff64869eb8dde8ca2b" exitCode=0 Sep 30 15:15:48 crc kubenswrapper[4763]: I0930 15:15:48.873682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-68kkj/must-gather-dmdrl" event={"ID":"00acb94e-4228-4d1d-9f74-1856acbc9d71","Type":"ContainerDied","Data":"fd447ad5c6030202550f7eaef684f6dd7b437341fee2afff64869eb8dde8ca2b"} Sep 30 15:15:48 crc kubenswrapper[4763]: I0930 15:15:48.875658 4763 scope.go:117] "RemoveContainer" containerID="fd447ad5c6030202550f7eaef684f6dd7b437341fee2afff64869eb8dde8ca2b" Sep 30 15:15:48 crc kubenswrapper[4763]: I0930 15:15:48.929887 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-68kkj_must-gather-dmdrl_00acb94e-4228-4d1d-9f74-1856acbc9d71/gather/0.log" Sep 30 15:15:56 crc kubenswrapper[4763]: I0930 15:15:56.580196 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-68kkj/must-gather-dmdrl"] Sep 30 15:15:56 crc kubenswrapper[4763]: I0930 15:15:56.581057 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-68kkj/must-gather-dmdrl" podUID="00acb94e-4228-4d1d-9f74-1856acbc9d71" containerName="copy" containerID="cri-o://c419163f3556c86546e46b86dde4b76a3cc62d84da4e3d9ef1f6ebb0db00133e" gracePeriod=2 Sep 30 15:15:56 crc kubenswrapper[4763]: I0930 15:15:56.588001 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-68kkj/must-gather-dmdrl"] Sep 30 15:15:56 crc kubenswrapper[4763]: I0930 15:15:56.952352 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-68kkj_must-gather-dmdrl_00acb94e-4228-4d1d-9f74-1856acbc9d71/copy/0.log" Sep 30 15:15:56 crc kubenswrapper[4763]: I0930 15:15:56.953197 4763 generic.go:334] "Generic (PLEG): container finished" podID="00acb94e-4228-4d1d-9f74-1856acbc9d71" containerID="c419163f3556c86546e46b86dde4b76a3cc62d84da4e3d9ef1f6ebb0db00133e" exitCode=143 Sep 30 15:15:56 crc kubenswrapper[4763]: I0930 15:15:56.953259 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab5914efc06ca547ec23c739591c6fc690bd641ef4c4b3f18f2c5a402f3ed3ca" Sep 30 15:15:56 crc kubenswrapper[4763]: I0930 15:15:56.962207 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-68kkj_must-gather-dmdrl_00acb94e-4228-4d1d-9f74-1856acbc9d71/copy/0.log" Sep 30 15:15:56 crc 
Sep 30 15:15:57 crc kubenswrapper[4763]: I0930 15:15:57.046092 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/00acb94e-4228-4d1d-9f74-1856acbc9d71-must-gather-output\") pod \"00acb94e-4228-4d1d-9f74-1856acbc9d71\" (UID: \"00acb94e-4228-4d1d-9f74-1856acbc9d71\") "
Sep 30 15:15:57 crc kubenswrapper[4763]: I0930 15:15:57.046207 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd552\" (UniqueName: \"kubernetes.io/projected/00acb94e-4228-4d1d-9f74-1856acbc9d71-kube-api-access-wd552\") pod \"00acb94e-4228-4d1d-9f74-1856acbc9d71\" (UID: \"00acb94e-4228-4d1d-9f74-1856acbc9d71\") "
Sep 30 15:15:57 crc kubenswrapper[4763]: I0930 15:15:57.053799 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00acb94e-4228-4d1d-9f74-1856acbc9d71-kube-api-access-wd552" (OuterVolumeSpecName: "kube-api-access-wd552") pod "00acb94e-4228-4d1d-9f74-1856acbc9d71" (UID: "00acb94e-4228-4d1d-9f74-1856acbc9d71"). InnerVolumeSpecName "kube-api-access-wd552". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 15:15:57 crc kubenswrapper[4763]: I0930 15:15:57.148491 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd552\" (UniqueName: \"kubernetes.io/projected/00acb94e-4228-4d1d-9f74-1856acbc9d71-kube-api-access-wd552\") on node \"crc\" DevicePath \"\""
Sep 30 15:15:57 crc kubenswrapper[4763]: I0930 15:15:57.181256 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00acb94e-4228-4d1d-9f74-1856acbc9d71-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "00acb94e-4228-4d1d-9f74-1856acbc9d71" (UID: "00acb94e-4228-4d1d-9f74-1856acbc9d71"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 15:15:57 crc kubenswrapper[4763]: I0930 15:15:57.250362 4763 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/00acb94e-4228-4d1d-9f74-1856acbc9d71-must-gather-output\") on node \"crc\" DevicePath \"\""
Sep 30 15:15:57 crc kubenswrapper[4763]: I0930 15:15:57.959549 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-68kkj/must-gather-dmdrl"
Sep 30 15:15:58 crc kubenswrapper[4763]: I0930 15:15:58.515227 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00acb94e-4228-4d1d-9f74-1856acbc9d71" path="/var/lib/kubelet/pods/00acb94e-4228-4d1d-9f74-1856acbc9d71/volumes"
Sep 30 15:16:35 crc kubenswrapper[4763]: I0930 15:16:35.379445 4763 scope.go:117] "RemoveContainer" containerID="fd447ad5c6030202550f7eaef684f6dd7b437341fee2afff64869eb8dde8ca2b"
Sep 30 15:16:35 crc kubenswrapper[4763]: I0930 15:16:35.444703 4763 scope.go:117] "RemoveContainer" containerID="c419163f3556c86546e46b86dde4b76a3cc62d84da4e3d9ef1f6ebb0db00133e"
Sep 30 15:16:36 crc kubenswrapper[4763]: I0930 15:16:36.060315 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:16:36 crc kubenswrapper[4763]: I0930 15:16:36.060747 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:17:06 crc kubenswrapper[4763]: I0930 15:17:06.060642 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:17:06 crc kubenswrapper[4763]: I0930 15:17:06.061243 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:17:36 crc kubenswrapper[4763]: I0930 15:17:36.059371 4763 patch_prober.go:28] interesting pod/machine-config-daemon-49jns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:17:36 crc kubenswrapper[4763]: I0930 15:17:36.059968 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:17:36 crc kubenswrapper[4763]: I0930 15:17:36.060024 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-49jns"
Sep 30 15:17:36 crc kubenswrapper[4763]: I0930 15:17:36.060794 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f293d7a11add6d24ec5417ff9c628d9e061fb881b0afc7494825f5b3a1b852db"} pod="openshift-machine-config-operator/machine-config-daemon-49jns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 15:17:36 crc kubenswrapper[4763]: I0930 15:17:36.060861 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-49jns" podUID="e3789557-abc5-4243-9049-4afe8717cdf9" containerName="machine-config-daemon" containerID="cri-o://f293d7a11add6d24ec5417ff9c628d9e061fb881b0afc7494825f5b3a1b852db" gracePeriod=600
Sep 30 15:17:36 crc kubenswrapper[4763]: I0930 15:17:36.771139 4763 generic.go:334] "Generic (PLEG): container finished" podID="e3789557-abc5-4243-9049-4afe8717cdf9" containerID="f293d7a11add6d24ec5417ff9c628d9e061fb881b0afc7494825f5b3a1b852db" exitCode=0
Sep 30 15:17:36 crc kubenswrapper[4763]: I0930 15:17:36.771238 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerDied","Data":"f293d7a11add6d24ec5417ff9c628d9e061fb881b0afc7494825f5b3a1b852db"}
Sep 30 15:17:36 crc kubenswrapper[4763]: I0930 15:17:36.771550 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-49jns" event={"ID":"e3789557-abc5-4243-9049-4afe8717cdf9","Type":"ContainerStarted","Data":"ef1453a047fd1757f19712301958f9186307edf65eca54d470e8e30e01fad232"}
Sep 30 15:17:36 crc kubenswrapper[4763]: I0930 15:17:36.771614 4763 scope.go:117] "RemoveContainer" containerID="aefbb9787a5240671a834d75c24d45e597dfa3c1adbcd20f7c3d69e7ab7dfa44"